Thursday, June 12, 2025

Context, Emotion, and Biology: What AI Misses in Language Comprehension

As meaning-makers, we use spoken or signed language to understand our experiences in the world around us. The emergence of generative artificial intelligence such as ChatGPT (built on large language models) calls into question the very notion of how to define “meaning.”

One popular characterization of AI tools is that they “understand” what they are doing. Nobel laureate and AI pioneer Geoffrey Hinton said: “What’s really surprised me is how good neural networks are at understanding natural language — that happened much faster than I thought…. And I’m still amazed that they really do understand what they’re saying.”

Hinton repeated this claim in an interview with Adam Smith, chief scientific officer for Nobel Prize Outreach. In it, Hinton stated that “neural nets are much better at processing language than anything ever produced by the Chomskyan school of linguistics.”

Chomskyan linguistics refers to American linguist Noam Chomsky’s theories about the nature of human language and its development. Chomsky proposes that there is a universal grammar innate in humans, which allows for the acquisition of any language from birth.

I’ve been researching how humans understand language since the 1990s, including more than 20 years of studies on the neuroscience of language. This has included measuring brainwave activity as people read or listen to sentences. Given my experience, I have to respectfully disagree with the idea that AI can “understand” — despite the growing popularity of this belief.

Generating text

First, it’s unfortunate that most people conflate text on a screen with natural language. Written text is related to — but not the same thing as — language.

For example, the same language can be represented by vastly different visual symbols. Look at Hindi and Urdu, for instance. At conversational levels, these are mutually intelligible and therefore considered the same language by linguists. However, they use entirely different writing scripts. The same is true for Serbian and Croatian. Written text is not the same thing as “language.”

Next let’s take a look at the claim that machine learning algorithms “understand” natural language. Linguistic communication mostly happens face-to-face, in a particular environmental context shared between the speaker and listener, alongside cues such as spoken tone and pitch, eye contact and facial and emotional expressions.

The importance of context

There is a lot more to understanding what a person is saying than merely being able to comprehend their words. Even babies, who are not experts in language yet, can comprehend context cues.

Take, for example, the simple sentence: “I’m pregnant,” and its interpretations in different contexts. If uttered by me, at my age, it’s likely my husband would drop dead with disbelief. Compare that level of understanding and response to a teenager telling her boyfriend about an unplanned pregnancy, or a wife telling her husband the news after years of fertility treatments.

In each case, the message recipient ascribes a different sort of meaning — and understanding — to the very same sentence.

In my own recent research, I have shown that even an individual’s emotional state can alter brainwave patterns when processing the meaning of a sentence. Our brains (and thus our thoughts and mental processes) are never without emotional context, as other neuroscientists have also pointed out.

So, while some computer code can respond to human language in the form of text, it does not come close to capturing what humans — and their brains — accomplish in their understanding.

It’s worth remembering that when workers in AI talk about neural networks, they mean computer algorithms, not the actual, biological brain networks that characterize brain structure and function. Imagine constantly confusing “flight” (as in birds migrating) with “flight” (as in airline routes) — this could lead to some serious misunderstandings!

Finally, let’s examine the claim about neural networks processing language better than theories produced by Chomskyan linguistics. This field assumes that all human languages can be understood via grammatical systems (in addition to context), and that these systems are related to some universal grammar.

Chomsky conducted research on syntactic theory as a paper-and-pencil theoretician. He did not conduct experiments on the psychological or neural bases of language comprehension. His ideas in linguistics are absolutely silent on the mechanisms underlying sentence processing and understanding.

What the Chomskyan school of linguistics does do, however, is ask questions about how human infants and toddlers can learn language with such ease, barring any neurobiological deficits or physical trauma.

There are at least 7,000 languages on the planet, and no one gets to pick where they are born. That means the human brain must be ready to comprehend and learn the language of their community at birth.


Regardless of where a child is born, the human brain is capable of acquiring any language. (Unsplash/tommao wang), CC BY

From this fact about language development, Chomsky posited an (abstract) innate module for language learning — not processing. From a neurobiological standpoint, the brain has to be ready to understand language from birth.

While there are plenty of examples of language specialization in infants, the precise neural mechanisms are still unknown, but not unknowable. Objects of study become unknowable, however, when scientific terms are misused or misapplied. And this is precisely the danger: conflating AI with human understanding can lead to dangerous consequences.

This post was originally published on The Conversation.


by Web Desk via Digital Information World

Everyday Habits That Quietly Threaten Your Hearing Health

Many people could be putting their hearing at risk without even knowing it, just by doing the same things they do every day.

From vacuuming the living room to listening to music on a morning commute, common habits are being flagged by sound experts as potential sources of long-term hearing loss. Although the effects are often gradual, the damage can build up over time.

Experts from DECIBEL warn that once hearing is damaged, it doesn’t tend to come back. The Royal National Institute for Deaf People estimates that nearly 18 million people in Britain live with some level of hearing loss, much of it preventable.

Hairdryers and Hoovers

Some of the biggest culprits can be found in almost every home. Hairdryers, often held close to the ear, can hit between 80 and 90 decibels. That’s not far off the noise level of city traffic, and if used daily, it adds up. One suggestion is to reduce how often you wash your hair, or to take breaks when drying, especially if your model doesn’t come with any noise-reducing features.

Vacuum cleaners tend to hover between 70 and 85 decibels. While that might seem harmless, cleaning for long stretches without ear protection can still do harm. Standing further back from the machine, or limiting usage time, can help ease the load on your ears.

Blenders are even louder. Some can briefly reach 100 decibels, the sound level of a motorcycle, and though they’re only used in short bursts, regular exposure can take its toll. Experts advise stepping back from the counter or wearing basic earplugs when using one repeatedly.

Tools That Talk Back

Out in the shed or garden, the noise doesn’t let up. Petrol lawnmowers and electric saws are both known for their roar. In fact, some power tools can go well past 100 decibels. Just 15 minutes of use may be enough to start damaging the tiny hairs inside the inner ear that are vital for hearing.
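The "15 minutes at power-tool volumes" figure lines up with a common occupational rule of thumb (the NIOSH-style 3-decibel exchange rate, which the article itself does not name): 8 hours is considered the daily limit at 85 decibels, and the allowable time halves for every 3-decibel increase. A minimal sketch of that calculation, under the stated assumptions:

```python
def safe_exposure_hours(level_db: float,
                        reference_db: float = 85.0,
                        reference_hours: float = 8.0,
                        exchange_rate_db: float = 3.0) -> float:
    """Rule-of-thumb maximum daily exposure time for a given noise level.

    Assumes 8 hours at 85 dB(A), with the allowable time halving for
    every 3 dB increase (the NIOSH-style 3-dB exchange rate).
    """
    return reference_hours / 2 ** ((level_db - reference_db) / exchange_rate_db)

# A 100 dB power tool: 8 / 2**5 = 0.25 hours, i.e. about 15 minutes
print(round(safe_exposure_hours(100) * 60), "minutes")
```

By the same rule, a 90-decibel hairdryer would allow roughly two and a half hours of daily exposure, which is why short daily use is less of a concern than prolonged tool work.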

To reduce the risk, it’s wise to take breaks, avoid working in confined spaces, and wear proper ear protection, not just the foam plugs, but over-the-ear defenders if possible.

Turning It Up Too Loud

Music is often seen as a comfort, but turning the volume up to block out the outside world can backfire. Using headphones at high volumes, especially in noisy environments like trains or gyms, increases the risk of hearing loss. The sound waves are delivered straight into the ear canal, and when the volume exceeds safe levels, they can damage the delicate hair cells that send sound signals to the brain.

There’s also evidence that overexposure to loud music can affect the way nerves transmit those signals, making it harder to understand speech even when no damage appears on a hearing test.

To reduce the harm, listeners are encouraged to use noise-cancelling headphones, take five-minute breaks every hour, and keep volume settings below 60 percent. Many phones give alerts when the sound level climbs too high; it’s worth paying attention to those warnings.

At concerts or clubs, the same rules apply. Earplugs designed for music fans don’t block out the experience, but they do take the edge off. It also helps to stay away from the speakers and step outside for fresh air now and again.

On the Road

Another source of noise that often gets overlooked is driving at speed with the windows down. The wind, tyre noise and engine rumble can easily push sound levels beyond 85 decibels, especially on highways and motorways. Over long journeys, that exposure builds up.

The solution is simple: keep windows shut when driving fast or in heavy traffic, and avoid drowning out the road noise with even louder music. For those on motorbikes or bicycles, helmets with built-in sound protection are a smart investment.

A Word on Cotton Buds

While not a noise issue, the use of cotton buds is another habit that can affect hearing. Many people still clean their ears this way, but it often does more harm than good. Pushing wax deeper inside can cause blockages, while scraping the sensitive skin may lead to infections or even a perforated eardrum.

Ears usually clean themselves, so there’s no need to dig deep. Putting anything too far inside can push wax further in or cause damage, so it’s best to be gentle and stick to cleaning only the outer part.

Small Changes, Big Difference

Protecting your hearing doesn’t mean avoiding all noise, just being smarter about how much and how often you're exposed to it. Simple steps like turning the volume down, stepping away from noisy appliances, or wearing proper protection can make a lasting difference.

Nature, too, offers a quieter rhythm — one that reminds us that not every moment needs to be filled with noise. Learning to listen more carefully, and more gently, could be the best habit of all.

Image: DIW-Aigen

Read next: Meta’s AI App Shows a Side of the Internet That Few Asked to See
by Irfan Ahmad via Digital Information World

Wednesday, June 11, 2025

Wikipedia Halts AI Summary Trial After Editors Raise Concerns

Wikipedia has paused a trial that used artificial intelligence to write short summaries of its articles, following complaints from volunteer editors.

As per 404 Media, the summaries were part of an experimental feature made available earlier this month to users who had a special browser extension and had chosen to take part. They appeared at the top of articles, but were hidden behind a click and marked with a yellow label reading “unverified”.

However, the test was met with immediate pushback from within the Wikipedia community. Editors said the summaries could mislead readers or contain errors, potentially damaging the website’s credibility.

One of the main concerns was the risk of AI producing inaccurate information — a problem often referred to as "hallucination", where the software invents facts or misrepresents them. Other publishers, including Bloomberg, have faced similar issues. Some have had to correct mistakes or scale back their own AI experiments as a result.

Wikipedia has said the feature is now on hold but hasn't ruled out the use of AI entirely. The Wikimedia Foundation, which oversees the platform, says it is still exploring how AI might help make the site more accessible — but insists any future tools must be accurate and trustworthy.


Image: DIW

Read next:

• Crypto Search Surge Places New York at the Forefront of U.S. Digital Currency Interest

• New WhatsApp Feature Summarizes Unread Chats Using Local AI, Bypassing Cloud-Based Data Handling
by Irfan Ahmad via Digital Information World

Crypto Search Surge Places New York at the Forefront of U.S. Digital Currency Interest

A new report examining internet search behavior across the United States has found that New York is the most interested state when it comes to cryptocurrency. The study, carried out by online crypto retail platform Zellix, identified the Empire State as leading the country in monthly online searches for crypto-related terms, suggesting a high level of curiosity and engagement with digital currencies.

According to the findings, New York residents carry out an average of 298 searches per 100,000 people each month for phrases linked to the world of crypto. This puts the state ahead of others including Nevada and California, which came in second and third place respectively.

The researchers behind the report analyzed Google search data for a wide range of cryptocurrency terms – from investment queries such as “what crypto to buy today” to more platform-specific phrases like “Coinbase login” or “Kraken Crypto”. The number of searches was then adjusted according to population size in order to reflect relative interest across the 50 states.

Nevada followed closely behind New York, with roughly 258 searches per 100,000 people, while California, home to Silicon Valley and several major tech firms, averaged just over 252. Hawaii and Alaska completed the top five, with figures suggesting that even states with smaller populations are demonstrating strong levels of engagement with the crypto space.

Where Curiosity Runs High

In several of the highest-ranking states, search terms such as “NFT”, “how to invest in crypto”, and “which cryptocurrency will explode in 2025” appeared frequently. Data suggests that much of the online interest is driven by a mix of speculative enthusiasm, basic education, and account management activity.

Florida, North Dakota, Washington, New Jersey and Massachusetts also made it into the top ten, with average monthly search figures ranging from just under 210 to nearly 233 per 100,000 residents.

At the other end of the spectrum, Mississippi recorded the lowest level of interest, with around 124 searches per 100,000 people. Other states with relatively low search volumes included Kentucky, West Virginia and Louisiana – all of which fell below 135 monthly searches per 100,000 residents.

Political Patterns and Cultural Clues

One striking pattern in the data is the apparent divide along political lines. Of the ten most crypto-interested states, seven voted for the Democratic candidate in the 2020 US presidential election. Alaska, North Dakota and Florida were the only Republican-leaning states to appear in the upper tier of the ranking.

Speaking on the findings, Zellix’s Chief Financial Officer Trajan King pointed to a wider cultural and technological context that may explain the variation in interest.

“States with a strong technology sector or a history of financial innovation tend to show higher levels of engagement with cryptocurrency,” he said. “It’s also worth noting the political divide. Conservative-leaning states have generally been slower to embrace newer financial technologies, particularly those associated with decentralization and disruption.”

King also suggested the trend could have implications for former President Donald Trump’s foray into digital currency. His $TRUMP token project has faced turbulence in recent months, and it remains to be seen whether Republican-leaning areas will be as receptive to crypto ventures under his name.

What People Are Searching For

The study’s methodology involved compiling a broad set of search terms tied to cryptocurrency, including basic questions such as “how to buy bitcoin”, “crypto for beginners”, and “can I invest in crypto”, alongside terms connected to popular platforms like Binance, Kraken and Coinbase.

By focusing on search volume per capita, the analysis was designed to highlight interest levels irrespective of state size. The inclusion of both casual and technical queries paints a picture of a public that is still learning, exploring, and (perhaps in some cases) speculating.

From NFTs to Exchanges, New Yorkers Lead the Country’s Crypto Curiosity

Below is a look at the most crypto-curious states, based on average monthly searches per 100,000 people:

State Keyword searches per 100K
New York 298.58
Nevada 257.63
California 252.43
Hawaii 244.69
Alaska 235.09
Florida 232.85
North Dakota 228.44
Washington 222.86
New Jersey 220.84
Massachusetts 209.78
Colorado 203.75
Oregon 202.44
Rhode Island 201.12
Connecticut 200.39
Virginia 200.15
Georgia 196.12
Texas 193.68
Illinois 192.24
Arizona 191.77
New Hampshire 191.39
Maryland 190.43
Maine 188.25
Vermont 187.91
Delaware 186.23
Wyoming 184.91
Minnesota 183.79
Kansas 177.77
Utah 175.8
North Carolina 174.13
South Dakota 170.05
Michigan 161.36
Nebraska 160.74
Pennsylvania 160.69
Montana 157.94
Idaho 157.91
South Carolina 155.78
New Mexico 154.93
Ohio 152
Missouri 150.78
Tennessee 150.27
Oklahoma 142.93
Arkansas 141.91
Alabama 140.88
Wisconsin 139.39
Iowa 137.77
Indiana 134.89
Louisiana 134.74
West Virginia 133.75
Kentucky 132.67
Mississippi 124.05

Methodology at a Glance

Zellix began by building a seed list of common search phrases linked to cryptocurrency, which included everything from investing advice to questions about crypto exchanges and NFT platforms. Using Google Keyword Planner, the team gathered monthly search data for each term across all US states. These figures were then adjusted by population to produce a final ranking based on searches per 100,000 people.
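The population adjustment described above is a straightforward per-capita normalization. A minimal sketch of the calculation, using hypothetical raw figures (the report's underlying data is not public, so the search counts and populations below are illustrative assumptions only):

```python
# Hypothetical raw figures for illustration; not from the Zellix report.
monthly_searches = {"New York": 58_000, "Nevada": 8_200}
population = {"New York": 19_500_000, "Nevada": 3_200_000}

def searches_per_100k(state: str) -> float:
    """Normalize raw monthly search volume by state population."""
    return monthly_searches[state] / population[state] * 100_000

# Rank states by relative interest rather than raw volume
ranked = sorted(monthly_searches, key=searches_per_100k, reverse=True)
for state in ranked:
    print(f"{state}: {searches_per_100k(state):.2f} per 100K")
```

Normalizing this way is what lets a small state like Nevada or Alaska rank near the top: raw search totals would simply mirror population size, while per-100K figures reflect how common the searches are among residents.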

Read next: Where in the World Are LinkedIn Users Most Likely to Call Themselves CEOs?
by Irfan Ahmad via Digital Information World

Tuesday, June 10, 2025

YouTube’s Expanding Creator Network Fuels a Booming Digital Economy in the U.S.

The influence of YouTube’s creator economy continues to expand, with new figures showing a sharp rise in both job creation and economic impact across the United States.

In a report released on Tuesday, data from Oxford Economics reveals that YouTube's broader creative ecosystem was responsible for generating more than 490,000 full-time jobs and contributing over 55 billion dollars to the US economy in the past year alone.

This economic network includes far more than just the individuals producing content. It draws in a range of professionals such as editors, production assistants and media strategists, alongside third-party companies that offer tools or services tailored to creators, including platforms like Patreon and Linktree. Together, they form a growing infrastructure that supports online content as a serious business sector, not just a form of personal expression.

What makes these figures especially notable is their rise during a period when investment into the creator economy has slowed. Venture capital firms, once enthusiastic backers of creator-focused startups, have become more cautious since their peak interest around four years ago. Yet despite that shift, YouTube’s ecosystem has expanded significantly. Just two years ago, similar research placed its economic contribution at 35 billion dollars with around 390,000 jobs supported, meaning the industry has added roughly 100,000 jobs and 20 billion dollars in new economic value since then.

Part of this momentum stems from YouTube’s ability to offer creators a more dependable income than most competing platforms. Those accepted into its Partner Program are eligible to receive a majority of the advertising revenue their videos generate. While mega-creators with enormous audiences earn headlines, many smaller creators also make thousands of dollars each month through steady video publishing, helped by YouTube’s more developed ad infrastructure. Short-form video platforms, by contrast, have struggled to find consistent ways to distribute advertising revenue, which leaves many creators there with fewer reliable earnings.

However, the financial side of the creator industry has not yet caught up with its cultural visibility. Many creators still face practical obstacles when navigating traditional business systems. Some encounter difficulty applying for loans or securing business credit, even when their income is stable and their audience reach is significant. In some cases, institutions lack the frameworks needed to understand how a digital career fits into existing models of self-employment or entrepreneurship.

With these concerns gaining visibility, lawmakers have begun to take notice. Last week, two members of the US House of Representatives announced the formation of a bipartisan caucus dedicated to the creator economy. The group aims to explore how policy can better reflect the realities of this fast-growing sector and support those working within it more effectively.

Image: DIW-Aigen

Read next: OpenAI's CEO Reveals Tiny Energy Footprint for ChatGPT Amid Rising Global Concerns Over AI Power Demands
by Irfan Ahmad via Digital Information World

OpenAI's CEO Reveals Tiny Energy Footprint for ChatGPT Amid Rising Global Concerns Over AI Power Demands

OpenAI's CEO says a typical ChatGPT request now uses around 0.34 watt-hours of energy and just 0.000085 gallons of water. That’s roughly what an oven consumes in a second, or a high-efficiency lightbulb burns in a couple of minutes. The water use, by comparison, equals about one-fifteenth of a teaspoon.

The numbers were shared in a blog post from Sam Altman, who outlined a vision in which artificial intelligence becomes cheap, powerful and widespread. He suggested that over time, the cost of using these systems could fall to match the cost of electricity alone.

Energy use in AI has become a growing concern. Recent research warned that by the end of this year, AI could consume more power than Bitcoin mining. Another study last year found that generating a single 100-word email through GPT-4 could require more than one bottle of water, depending on where the data center is located.

OpenAI’s estimates suggest improvements in efficiency, but they reflect average usage and don’t account for peak demand or training costs. Tools like ChatGPT are now used by hundreds of millions of people, and even small changes in performance or cost can scale quickly.
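The scaling concern is easy to make concrete. A minimal sketch, using OpenAI's stated per-query average and a purely hypothetical daily query volume (the one-billion figure below is an assumption for illustration, not an OpenAI number):

```python
WH_PER_QUERY = 0.34  # OpenAI's stated average energy per ChatGPT request

def daily_energy_mwh(queries_per_day: float) -> float:
    """Total daily energy, in megawatt-hours, for a given query volume."""
    return queries_per_day * WH_PER_QUERY / 1_000_000  # Wh -> MWh

# Hypothetical: 1 billion requests per day (an assumption, not an
# OpenAI figure) works out to roughly 340 MWh per day.
print(round(daily_energy_mwh(1_000_000_000)), "MWh/day")
```

Even with a tiny per-query footprint, the total grows linearly with usage, which is why averages alone say little about aggregate demand, and why the omitted training and peak-load costs matter.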

The blog also outlined a future timeline. In 2025, AI agents are expected to take on deeper cognitive work like writing code. In 2026, systems may begin offering new insights rather than just processing known information. By 2027, OpenAI believes robots could handle real-world tasks.

The post argues that AI systems are starting to improve their own development process. Faster tools help researchers discover better models and algorithms. These tools are now being used to accelerate the design of the next generation of AI, in what the company describes as a kind of early self-improvement loop.

As the technology spreads, OpenAI warns that both benefits and risks will grow. A minor error could have a wide effect when amplified across millions of users. But the company believes that with strong governance, abundant intelligence and cheap energy, it’s possible to achieve rapid scientific progress and quality-of-life improvements, while keeping systems aligned with human goals.

OpenAI says the future might not feel like a sharp break, but rather a steady shift. Looking back, though, the changes could seem vast.

Image: DIW-Aigen

Read next: 

• ChatGPT Drew 177.42 Million Daily Visitors in May 2025, Reflecting Massive Global Engagement

• Meta's Threads to Pilot Native Messaging Feature, Starting in Asia and South America
by Irfan Ahmad via Digital Information World

Meta's Threads to Pilot Native Messaging Feature, Starting in Asia and South America

Meta is preparing to roll out a long-anticipated messaging function within its Threads platform, removing the current reliance on Instagram for private conversations. This week, the company begins testing the internal direct messaging capability, initially in Hong Kong, Thailand, and Argentina, before scaling the feature to additional markets.

Instead of routing messages through Instagram’s infrastructure, Threads will handle them natively using a standalone inbox. The change marks a shift away from the previous approach, where Threads activity was tightly coupled with Instagram accounts: everything from sign-up credentials to the automatic syncing of follower lists depended on the older platform.

Access to the new inbox comes through a redesigned interface. On mobile devices, both iOS and Android, users can tap a newly added envelope icon situated along the bottom navigation bar. Desktop users will find the messaging symbol placed on the left-hand panel. Meta has yet to provide a specific timeline for a broader rollout beyond the initial test regions.


The lack of private communication tools has been a persistent complaint from the Threads user base. Until now, those wishing to engage in one-on-one exchanges had to switch apps and initiate chats on Instagram. Introducing a dedicated messaging layer on Threads not only streamlines communication but also caters to creators and business accounts that often juggle engagement across multiple platforms.

The move signals Meta’s intent to refine Threads into a fuller social media ecosystem, one that could eventually rival competitors like X (formerly Twitter), which has long provided integrated messaging as a core component.

Read next:

TikTok and ChatGPT Took the Lead in Mobile App Revenue as Spending Jumped in May
by Irfan Ahmad via Digital Information World