It looks like some of Apple’s most loyal customers aren’t too happy with its policy regarding generative AI.
This year has marked a major milestone for the Cupertino firm, which unveiled its collaboration with the makers of ChatGPT alongside the rollout of Apple Intelligence. The iPhone maker has officially entered the heated AI race after sitting on the sidelines for months.
The company confirmed that devices running Apple Intelligence, the name reserved for the tech giant's version of generative AI, will arrive this year. Among the many features is the ability to generate images from text prompts.
While the news might excite many Apple users, it's leaving a bad taste in the mouths of members of the creative community, who are troubled by how little the company has said about its AI models. Chief among their concerns is the lack of transparency about how the models were trained and what data was used.
Many feel this is information every Apple user deserves to know, and its absence is especially disappointing to creative artists whose livelihoods depend on the content they create.
For years, this community has looked up to Apple as a pioneer at the intersection of technology and the liberal arts. Now, those same people are expressing frustration at how silent Apple has become about where it obtains the data used in its AI models.
Generative AI's success depends heavily on the training data used to build it. Yet many firms continue to ingest data available online without seeking consent from, or providing compensation to, the original creators.
The possibility that Apple has done the same as the others is disheartening for some, especially those who have placed their faith in the iPhone maker for decades.
To put this in perspective, the LAION-5B dataset, which has been used to train image-generation models, contains close to six billion pictures scraped from the web without creators' consent, and hardly anyone has been called out for such practices.
This is why the creative community, including musicians, authors, and visual artists, has united to take a stand against anyone consuming their work for free and profiting from it.
We have seen top music labels such as Sony and Universal sue AI music startups for copyright infringement. Today, tech giants are striking licensing deals with online content producers, such as news publishers, to shield themselves from legal trouble.
Seeing Apple take the same route bothers a community that expected so much more from the Cupertino company, while others admit they gave the iPhone maker the benefit of the doubt when perhaps they shouldn't have.
Apple has not revealed the training sources for Apple Intelligence so far, nor has it committed to doing so. Its blog post says it used the same means as others in the industry: scraping publicly available data from the web with Applebot. Are you shocked?
Image: DIW-Aigen
by Dr. Hura Anwar via Digital Information World