From Apple’s announcement: “Privacy protections are built in for users who access ChatGPT — their IP addresses are obscured, and OpenAI won’t store requests. ChatGPT’s data-use policies apply for users who choose to connect their account.”
Nothing says “We’re confident in the software we’re selling” like being willing to work for exposure in the hope that somebody shells out $20 for a subscription.
Exposure = They get to keep the data they get.
Data = money
They’ve found a way to make it work I’m sure.
Apparently they won’t be collecting data.
https://www.apple.com/newsroom/2024/06/introducing-apple-intelligence-for-iphone-ipad-and-mac/
They pinky promise not to collect the data
I mean Apple’s the one saying it. I doubt OpenAI wants to piss them off.
How would Apple technically know that nothing is stored? Even if IPs are hidden, the questions and answers are still useful for OpenAI.
They probably have a deal similar to DuckDuckGo’s: “We call model providers on your behalf so your personal information (for example, IP address) is not exposed to them. In addition, we have agreements in place with all model providers that further limit how they can use data from these anonymous requests, including not using Prompts and Outputs to develop or improve their models, and deleting all information received once it is no longer necessary to provide Outputs (at most within 30 days, with limited exceptions for safety and legal compliance).”
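To make the “calls model providers on your behalf” part concrete, here is a minimal sketch of what such a relay conceptually does, in Swift using only Foundation. The provider URL, header names, and types are hypothetical (this is not DuckDuckGo’s or Apple’s actual implementation); the point is that only the prompt goes upstream, authenticated with the relay’s own key, so the provider never sees the user’s IP address or account.

```swift
import Foundation

// Hypothetical relay-side handler. The provider URL, header names, and request
// shape are illustrative, not any vendor's real API.
struct IncomingChat {
    let prompt: String
    let clientIP: String     // known to the relay, deliberately never forwarded
    let accountID: String?   // likewise kept out of the upstream request
}

func forwardAnonymously(_ chat: IncomingChat,
                        to providerURL: URL,
                        relayAPIKey: String) async throws -> Data {
    var request = URLRequest(url: providerURL)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    // The relay authenticates with its own key, so the provider sees the relay
    // as the caller rather than the end user.
    request.setValue("Bearer \(relayAPIKey)", forHTTPHeaderField: "Authorization")
    // Only the prompt goes upstream; clientIP and accountID never leave the relay.
    request.httpBody = try JSONSerialization.data(withJSONObject: ["prompt": chat.prompt])

    let (data, _) = try await URLSession.shared.data(for: request)
    return data
}
```

The no-training and 30-day-deletion terms in the quote are contractual on top of that; the relay itself only prevents the provider from tying a request back to a specific user.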
Yeah there’s definitely a contract, but OpenAI could decide it’s more profitable to break the contract and pay for lawyers and a settlement. Probably unlikely though, to be fair.
I don’t believe that at all! There’s some kind of catch.
Even if they don’t sell the data I’m sure it’s valuable for their models.
This seems so obvious I’m amazed Apple isn’t charging them for the “exposure”
Apple needs them just as badly as OpenAI needs every iPhone user. They’re horribly behind in this “AI” bubble.
With the sheer amount of money that the rich are throwing at OpenAI via investment firms, they neither need nor want to charge, imo. The fact that they’re being built into Apple’s ecosystem and getting name-dropped to people inside of iOS is kinda what their investors want.
It’s the age-old “Walmart opens and operates at a loss for two years to force others out of business, then jacks up prices” model.
Investors want them to cement this as The AI company & brand, so that once it gets giant and starts to be profitable just by being the biggest gorilla in the room, the shares they bought are worth more.
So what I’m trying to say is that our version of capitalism is perfect and makes lots of sense and is in no way insane and degenerate.
That might be their internal reasoning, but Apple will very quickly move to bring these capabilities in-house. Apple has been working on machine learning for a while, but they don’t collect data, so they haven’t been able to build these LLMs.
For now it makes sense for Apple to leave the liability of basing these LLMs on copyrighted data with OpenAI. If OpenAI loses those court battles, they take the hit for services rendered to Apple. None of that liability transfers to Apple.
Meanwhile, Apple is going about this the Apple way by encouraging developers to integrate their apps into the new frameworks being added (something like the App Intents sketch below). This gives them access to user data directly from the source, allowing them to build personalized models.
These models will likely be far more useful for the day-to-day mundanity of life than the hallucinogenic encyclopedia that is ChatGPT.
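On the frameworks point: presumably this refers to things like the App Intents framework, where an app declares actions and data the system can surface and invoke. A minimal sketch of what a developer would expose, with a made-up intent, parameter, and dialog (not anything Apple or any particular app ships):

```swift
import AppIntents

// Hypothetical intent: the name, parameter, and dialog are illustrative only.
// Declaring an action this way tells the system what the app can do and what
// data the action involves, so Siri / Apple Intelligence can invoke it directly.
struct LogWaterIntake: AppIntent {
    static var title: LocalizedStringResource = "Log Water Intake"

    @Parameter(title: "Milliliters")
    var amount: Int

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // The app's own storage/update logic would run here.
        return .result(dialog: "Logged \(amount) ml of water.")
    }
}
```

That is roughly how Apple gets signal “directly from the source”: the app describes what it can do and what data is involved in a structured way, and the system decides when to invoke it.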