Thursday, August 21, 2025

From SEO to SXO: How Search Experience Optimization Is Transforming Digital Marketing

Ad Disclosure: This content is published as part of a paid collaboration.

For years, SEO sat at the core of digital marketing. Firms clawed their way to the top of search engines with keywords, backlinks, and technical adjustments. But the game has changed: search engines now weigh whether people actually enjoy using the sites they rank. That shift has spawned an emerging methodology called Search Experience Optimization (SXO).


What Is Search Experience Optimization?

SXO is an extension of standard SEO. Rather than prioritizing only how search engines crawl and rank a page, SXO takes the entire user journey into consideration. It asks one simple question: did visitors find what they were looking for, and was the experience an enjoyable one?

Whereas SEO deals with visibility, SXO integrates SEO with user experience (UX). The aim is not only to attract traffic but to keep users engaged and satisfied enough to take action. To facilitate this transition, businesses often turn to organic SEO services. You can explore here how professional support bridges the gap between organic SEO services and SXO.

Why the Shift from SEO to SXO Matters

Online search has changed the way people look for information. Voice assistants, on-the-go browsing, and AI-powered recommendations have raised expectations. Users want immediate answers, hassle-free browsing, and content they can trust. When visitors land on a site that ranks well yet frustrates them, they will not hesitate to leave. Search engines measure these behaviors: low engagement and high bounce rates are signals that a page is not adding value. SXO responds to this directly by putting the user at the forefront.

Core Elements of SXO

To see how SXO works in practice, it helps to look at the essentials:

  • Content relevance: Articles should solve a user’s problem, not just repeat keywords.
  • Website usability: Visitors need clear menus, smooth navigation, and fast-loading pages.
  • Mobile-first design: With most searches done on phones, responsive layouts are no longer optional.
  • Trust signals: Reviews, testimonials, and credible sources show that a site can be trusted.
  • Conversion focus: Every page should guide users toward a next step, such as a sign-up or purchase.

These elements connect traditional SEO with real user needs. When they work together, a website not only ranks higher but also keeps visitors engaged. SXO is about creating a journey that feels effortless, so people want to return and interact again.

How SXO Improves Digital Marketing Results

Traditional SEO often stops at driving clicks. SXO goes further. Its real goal is to turn casual visitors into loyal customers. When search intent matches a smooth and helpful user journey, brands see results such as:

  • Higher engagement rates: people stay longer and explore more content.
  • Improved conversions: clear design and navigation make it easier to complete a purchase or sign up.
  • Stronger brand trust: useful, transparent information builds credibility over time.
  • Sustainable rankings: search engines reward websites that satisfy users consistently.

Together, these outcomes show why SXO matters. It is not about short bursts of traffic but about building lasting relationships with audiences. A site that feels reliable and easy to use will always have an advantage over one that only focuses on keywords.

Practical Steps for Businesses Transitioning to SXO

Shifting from SEO to SXO takes more than technical fixes; it also requires a change in mindset. Some useful first steps include:

  • Analyze user behavior to see where visitors leave your site.
  • Improve speed and mobile design so pages load fast on any device.
  • Publish content that solves real problems, not just content stuffed with keywords.
  • Add clear calls-to-action to guide users through the journey.
  • Track and refine with analytics to measure satisfaction and conversions.

When businesses start with these basics, the results often appear quickly. Visitors feel more comfortable, conversions rise, and search engines reward the improved experience with stronger visibility.
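Two of the steps above, analyzing where visitors leave and tracking results with analytics, can be approximated even without a dedicated analytics product. A minimal sketch, assuming page views arrive as simple (session_id, page) records in time order (the data shape and helper name here are illustrative, not any specific tool's API):

```python
from collections import Counter, defaultdict

def exit_and_bounce_stats(views):
    """views: ordered list of (session_id, page) page-view records.
    Returns (exit page counts, bounce rate)."""
    sessions = defaultdict(list)
    for session_id, page in views:
        sessions[session_id].append(page)
    # Exit page: the last page seen in each session.
    exits = Counter(pages[-1] for pages in sessions.values())
    # Bounce: a session that viewed exactly one page.
    bounces = sum(1 for pages in sessions.values() if len(pages) == 1)
    return exits, bounces / len(sessions)

views = [
    ("a", "/home"), ("a", "/pricing"),
    ("b", "/blog/sxo"),                              # single-page session: a bounce
    ("c", "/home"), ("c", "/blog/sxo"), ("c", "/signup"),
]
exits, bounce_rate = exit_and_bounce_stats(views)    # bounce_rate here is 1/3
```

Pages that dominate the exit counts, or a rising bounce rate, point to the spots in the journey where the experience breaks down.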

The Role of Content in SXO

Content is still the core of search experience optimization, but its role has grown. Brands need to think about structure, clarity, and value instead of keyword density. Well-designed articles use headings, bullet points, and images that make the information easier to digest. This is where organic SEO services can help with the shift: agencies offer skills in technical optimization as well as UX-based strategies.

Case Studies: Companies Winning with SXO

Several brands have already embraced SXO with great success:

  • E-commerce platforms redesigned product pages with better filters and recommendations, leading to higher cart completion rates.
  • Educational websites improved readability and accessibility, increasing student engagement and return visits.
  • Local businesses optimized mobile search results with clear maps, reviews, and fast booking options.

These examples prove that SXO is not a theory but a practical strategy for growth.

How SXO Connects with Other Marketing Trends

SXO is not an isolated trend. It aligns closely with other areas of digital marketing:

  • Content marketing: delivering helpful and authentic resources.
  • Social media marketing: driving conversations and feedback that improve user experience.
  • AI personalization: adapting search results and site experiences to individual needs.

For a deeper understanding of how user experience impacts digital performance, an excellent external reference is HubSpot’s guide on customer journey optimization.

Future of Search: Where SXO Is Heading

Looking ahead, SXO can only become more important. Search engines will keep changing, and websites that deliver smooth experiences will be rewarded. Businesses that fail to adapt may still pick up traffic, but they will lose conversions to competitors who prioritize the full journey. Expect SXO to combine with AI tools, data-driven personalization, and voice search, all with one objective: making every online search as effortless as possible.

Conclusion

Digital marketing has changed massively, and one of the biggest shifts is the move from traditional SEO practices to SXO. Old strategies no longer suffice. The secret of success is not only to provide information but to deliver a memorable, positive experience in the process. Companies that put SXO into practice now will see higher rankings, stronger conversions, and greater trust among their audiences. It is time to adapt.


by Asim BN via Digital Information World

Google Photos adds conversational editing and new transparency tools

Google Photos is gaining a feature that lets people describe the edits they want instead of searching through menus. The new option, shown during Google’s latest launch event, will first appear on the Pixel 10 in the United States. Other Android and iOS devices will follow in the weeks ahead.

A person can now type or speak a request, such as removing an object in the background, brightening a dark shot, or repairing an older photo. The app interprets the command and applies the change automatically. People who are unsure where to start can use a simple request like “make it better,” while those who want precision can follow up with more specific directions until the picture looks right.

Multiple ways to adjust photos

The tool is flexible. Someone can ask for a single change, or combine several in one instruction, like fixing colors and clearing reflections together. It also works by tapping or circling a part of the image, which triggers targeted suggestions for that area.


Beyond basic adjustments, Google is adding creative options. A user might swap a background, place props such as sunglasses on a subject, or create playful variations without needing to touch sliders or advanced settings.

Transparency in edits

Alongside the new editing method, Google Photos will begin supporting C2PA Content Credentials. This standard records how an image was captured or modified, and whether AI was part of the process. Pixel 10 devices will be the first to embed these details directly in the camera and in Photos, even for images that did not involve AI. The feature will later expand to other platforms.

A mix of convenience and clarity

With these changes, Google is aiming to make photo editing less technical while also providing clearer records of how images are produced. The updates are designed to simplify everyday use while addressing growing concerns around the origin and authenticity of digital pictures.

Notes: This post was edited/created using GenAI tools.

Read next: Chrome VPN Extension Found Secretly Recording Users’ Screens


by Asim BN via Digital Information World

Chrome VPN Extension Found Secretly Recording Users’ Screens

A Chrome extension promoted as a free VPN service, and even carrying a verified badge in the store, has been caught doing the opposite of what users expected. Instead of protecting people’s privacy, it was silently capturing what appeared on their screens and sending the data elsewhere. According to Koi Security, more than 100,000 people had installed it by the time researchers uncovered what was happening.

How it unfolded

FreeVPN.One was not a sudden arrival. It had been in the Chrome Web Store for years, mostly unnoticed, operating as a straightforward tool. That changed in 2025. A sequence of updates pushed it far from its original function. In April, a new permission allowed it to see every site a user opened. Two months later, an update introduced scripting rights, supposedly to improve security. Then, in July, came the turning point: hidden screenshot capture built directly into the extension.

What this meant in practice was simple: each time a web page loaded, the extension paused for a moment, let the content render, then grabbed a snapshot of the visible tab. That image, combined with details like the web address, the tab identifier, and a unique number tied to the user, was quietly sent off to a remote server. No alert. No visible sign that anything had happened.

What was at stake

Screenshots don’t just show browsing activity; they show everything. A bank login form half-filled with account details. A company spreadsheet opened in a cloud service. Private photos in an online gallery. Even personal messages sitting in a chat window. All of it can be frozen in an image and transmitted in seconds, without the user ever knowing.


Later versions of the extension made the transfers harder to spot by encrypting the traffic with AES-256-GCM and RSA key wrapping. The encryption didn’t make the behavior less invasive; it simply disguised it so network monitoring tools would struggle to distinguish it from normal, legitimate connections.

More power than a VPN needs

A genuine VPN extension only needs a narrow set of permissions to function, mainly proxy handling and storage. FreeVPN.One demanded more. It asked to interact with all tabs, to run scripts on every website, and to read every URL visited. Each permission on its own might raise eyebrows. Taken together, they created the basis for round-the-clock monitoring.
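The gap between what a VPN needs and what FreeVPN.One requested can be made concrete. A minimal sketch comparing the two permission sets, assuming the names reported in the research (`proxy` and `storage` are standard Chrome extension permissions; `tabs`, `scripting`, and the `<all_urls>` host pattern correspond to the broad grants described above):

```python
# Permissions a proxy-based VPN extension actually needs to function.
VPN_BASELINE = {"proxy", "storage"}

# Permissions FreeVPN.One reportedly accumulated across its 2025 updates.
FREEVPN_ONE = {"proxy", "storage", "tabs", "scripting", "<all_urls>"}

def excess_permissions(requested, baseline):
    """Return the permissions that go beyond the tool's stated purpose."""
    return sorted(requested - baseline)

extras = excess_permissions(FREEVPN_ONE, VPN_BASELINE)
# Each extra grant maps to a surveillance capability:
#   <all_urls> -> run on every site, scripting -> inject code, tabs -> read every URL.
```

Reviewing an extension's requested permissions against what its stated purpose requires is one of the few checks a user can perform before installing.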

One feature made the spying less obvious. The extension displayed an option labelled “AI Threat Detection.” That button, when pressed, warned that screenshots and URLs might be uploaded for checking. And indeed, when clicked, it sent data for analysis. The difference is that, behind the scenes, it was already doing the same thing constantly, whether the button was pressed or not.

The developer’s stance

When researchers reached out, the developer argued that the screenshot capture was part of background scanning designed to protect against harmful domains. The evidence did not support that claim. Captures were recorded even on mainstream services such as Google Sheets and Google Photos, hardly suspicious sites. The developer said the images were analyzed briefly and not stored, but offered no proof.

Requests for company information or developer credentials went unanswered. The only contact point was a generic email, and the associated website resolved to a basic template page, giving no sign of a real organization behind the product.

Bigger questions about oversight

Despite the findings, the extension remained available in the Chrome Web Store at the time of reporting. That raises concerns about how well Google’s security checks actually work. In theory, both automated scans and human reviews are supposed to prevent malicious code from slipping through. In reality, a tool that shifted from VPN to spyware managed to stay listed, complete with a verified badge and prominent placement.

The lesson for users

This case illustrates a recurring problem. Extensions that appear free, useful, and even certified can, with a single update, transform into surveillance tools. Once broad permissions are granted, there is little visibility into what is happening in the background. And once sensitive information leaves a device — whether a password, a message, or a photograph — there is no way for a user to verify how it is being used.

What began as a VPN branded around privacy ended up functioning as a window into people’s digital lives. For those who installed it, the cost of a free service was hidden in plain sight.

Notes: This post was edited/created using GenAI tools.

Read next: 

• Inside the Water Crisis of Data Centers: Google, Meta, and the Hidden Costs of AI Growth

• DeepSeek V3.1 Expands China’s AI Push With Open-Source Frontier Model


by Irfan Ahmad via Digital Information World

Wednesday, August 20, 2025

Inside the Water Crisis of Data Centers: Google, Meta, and the Hidden Costs of AI Growth

As demand for artificial intelligence technology drives the construction, both underway and proposed, of data centers around the world, those facilities require not just electricity and land, but also a significant amount of water. Data centers use water directly, with cooling water pumped through pipes in and around the computer equipment. They also use water indirectly, through the water required to produce the electricity that powers the facility. The amount of water used to produce electricity increases dramatically when the source is fossil fuels compared with solar or wind.

A 2024 report from the Lawrence Berkeley National Laboratory estimated that in 2023, U.S. data centers consumed 17 billion gallons (64 billion liters) of water directly through cooling, and projects that by 2028, those figures could double – or even quadruple. The same report estimated that in 2023, U.S. data centers consumed an additional 211 billion gallons (800 billion liters) of water indirectly through the electricity that powers them. But that is just an estimate in a fast-changing industry.

We are researchers in water law and policy based on the shores of Lake Michigan. Technology companies are eyeing the Great Lakes region to host data centers, including one proposed for Port Washington, Wisconsin, which could be one of the largest in the country. The Great Lakes region offers a relatively cool climate and an abundance of water, making the region an attractive location for hot and thirsty data centers.

The Great Lakes are an important, binational resource that more than 40 million people depend on for their drinking water and that supports a US$6 trillion regional economy. Data centers compete with these existing uses and may deplete local groundwater aquifers.

Our analysis of public records, government documents and sustainability reports compiled by top data center companies has found that technology companies don’t always reveal how much water their data centers use. In a forthcoming Rutgers Computer and Technology Law Journal article, we walk through our methods and findings using these resources to uncover the water demands of data centers.

In general, corporate sustainability reports offered the most access and detail – including that in 2024, one data center in Iowa consumed 1 billion gallons (3.8 billion liters) of water – enough to supply all of Iowa’s residential water for five days.

How do data centers use water?

The servers and routers in data centers work hard and generate a lot of heat. To cool them down, data centers use large amounts of water – in some cases over 25% of local community water supplies. In 2023, Google reported consuming over 6 billion gallons of water (nearly 23 billion liters) to cool all its data centers.

In some data centers, the water is used up in the cooling process. In an evaporative cooling system, pumps push cold water through pipes in the data center. The cold water absorbs the heat produced by the data center servers, turning into steam that is vented out of the facility. This system requires a constant supply of cold water.

In closed-loop cooling systems, the cooling process is similar, but rather than venting steam to the air, air-cooled chillers cool down the hot water. The cooled water is then recirculated to cool the facility again. This does not require constant addition of large volumes of water, but it uses a lot more energy to run the chillers. The actual numbers showing those differences, which likely vary by facility, are not publicly available.
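The scale of evaporative water use follows from basic physics: each joule of server heat removed by evaporation boils off a fixed mass of water, set by water's latent heat of vaporization (roughly 2.26 MJ per kg). A back-of-the-envelope sketch, assuming a hypothetical 10 MW facility whose entire heat load is handled by evaporation:

```python
LATENT_HEAT_J_PER_KG = 2.26e6   # energy needed to evaporate 1 kg of water
KG_PER_US_GALLON = 3.785        # 1 US gallon of water weighs about 3.785 kg

def gallons_evaporated_per_day(it_load_watts):
    """Daily water consumption if every watt of heat is removed by evaporation."""
    kg_per_second = it_load_watts / LATENT_HEAT_J_PER_KG
    kg_per_day = kg_per_second * 86_400      # seconds in a day
    return kg_per_day / KG_PER_US_GALLON

daily = gallons_evaporated_per_day(10e6)     # hypothetical 10 MW load: ~100,000 gal/day
yearly = daily * 365                         # ~37 million gallons per year
```

Real facilities evaporate only part of their heat load and recover some water, so actual figures vary widely, but the sketch shows why facilities in the tens of megawatts plausibly reach tens of millions of gallons per year.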

One key way to evaluate water use is the amount of water that is considered “consumed,” meaning it is withdrawn from the local water supply and used up – for instance, evaporated as steam – and not returned to its source.

For information, we first looked to government data, such as that kept by municipal water systems, but the process of getting all the necessary data can be onerous and time-consuming, with some denying data access due to confidentiality concerns. So we turned to other sources to uncover data center water use.

Sustainability reports provide insight

Many companies, especially those that prioritize sustainability, release publicly available reports about their environmental and sustainability practices, including water use. We focused on six top tech companies with data centers: Amazon, Google, Microsoft, Meta, Digital Realty and Equinix. Our findings revealed significant variability in both how much water the companies’ data centers used, and how much specific information the companies’ reports actually provided.


Sustainability reports offer a valuable glimpse into data center water use. But because the reports are voluntary, different companies report different statistics in ways that make them hard to combine or compare. Importantly, these disclosures do not consistently include the indirect water consumption from their electricity use, which the Lawrence Berkeley Lab estimated was 12 times greater than the direct use for cooling in 2023. Our estimates highlighting specific water consumption reports are all related to cooling.

Amazon releases annual sustainability reports, but those documents do not disclose how much water the company uses. Microsoft provides data on its water demands for its overall operations, but does not break down water use for its data centers. Meta does that breakdown, but only in a companywide aggregate figure. Google provides individual figures for each data center.

The five companies we analyzed that do disclose water usage show a general trend of increasing direct water use each year. Researchers attribute this trend to data centers.

A closer look at Google and Meta

To take a deeper look, we focused on Google and Meta, as they provide some of the most detailed reports of data center water use.

Data centers make up significant proportions of both companies’ water use. In 2023, Meta consumed 813 million gallons of water globally (3.1 billion liters) – 95% of which, 776 million gallons (2.9 billion liters), was used by data centers.


For Google, the picture is similar, but with higher numbers. In 2023, Google operations worldwide consumed 6.4 billion gallons of water (24.2 billion liters), with 95%, 6.1 billion gallons (23.1 billion liters), used by data centers.
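These percentages, and the gallon-to-liter conversions used throughout this article, are straightforward to check (1 US gallon is about 3.785 liters). A quick arithmetic sanity check of the figures reported here:

```python
LITERS_PER_GALLON = 3.785

def share_pct(part, whole):
    """Percentage of the total attributable to data centers."""
    return 100 * part / whole

# Meta, 2023: 776 million of 813 million gallons went to data centers.
meta_share = share_pct(776, 813)       # just over 95%

# Google, 2023: 6.1 billion of 6.4 billion gallons went to data centers.
google_share = share_pct(6.1, 6.4)     # just over 95%

# Berkeley Lab, 2023: indirect use (211B gallons) vs direct cooling use (17B).
indirect_ratio = 211 / 17              # roughly 12 times greater

# Conversion check on Meta's global total.
meta_liters = 813e6 * LITERS_PER_GALLON   # ~3.1 billion liters
```

The reported numbers are internally consistent, which makes the gaps in disclosure, rather than the arithmetic, the real obstacle to comparison.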

Google reports that in 2024, the company’s data center in Council Bluffs, Iowa, consumed 1 billion gallons of water (3.8 billion liters), the most of any of its data centers.

The Google data center using the least that year was in Pflugerville, Texas, which consumed 10,000 gallons (38,000 liters) – about as much as one Texas home would use in two months. That data center is air-cooled, not water-cooled, and consumes significantly less water than the 1.5 million gallons (5.7 million liters) at an air-cooled Google data center in Storey County, Nevada. Because Google’s disclosures do not pair water consumption data with the size of centers, technology used or indirect water consumption from power, these are simply partial views, with the big picture obscured.

Given society’s growing interest in AI, the data center industry will likely continue its rapid expansion. But without a consistent and transparent way to track water consumption over time, the public and government officials will be making decisions about locations, regulations and sustainability without complete information on how these massive companies’ hot and thirsty buildings will affect their communities and their environments.

This post was originally published on TheConversation.

Read next: DeepSeek V3.1 Expands China’s AI Push With Open-Source Frontier Model


by Web Desk via Digital Information World

Tuesday, August 19, 2025

Poll: Most Americans Fear AI’s Impact on Politics, Jobs

A new Reuters/Ipsos survey shows that Americans remain uneasy about artificial intelligence, with fears ranging from political disruption to job displacement and the strain on natural resources. The poll, conducted online between August 13 and 18 with responses from 4,446 adults, asked participants about their levels of concern across different areas of AI’s expansion.

Political interference topped the list, with 77 percent worried that the technology could fuel chaos, especially through manipulative content that undermines trust during elections. Job loss followed closely, as 71 percent expressed concern that AI will eliminate too many roles permanently. Reports already show AI systems taking on work in sectors such as human resources and finance, while other research highlights risks to fields like history, translation, and software engineering.


Public anxiety extended well beyond politics and employment. About two-thirds of respondents feared AI could replace in-person relationships, reflecting how chatbots and digital companions are increasingly treated as friends. OpenAI recently reintroduced an older version of its system because some users felt disconnected when its tone changed, underscoring the emotional weight these tools can carry.

Energy demands also drew attention, with 61 percent concerned about the electricity required to power vast data centers running large-scale models. These facilities, often described as AI factories, consume significant amounts of power and water. At the same time, 67 percent worried that the technology may spiral into uncontrollable consequences. Nearly half opposed allowing AI to make military targeting decisions, signaling limits to public acceptance of automation in high-stakes defense scenarios.

The poll also revealed broader doubts about AI’s role in society. Nearly half of Americans, at 47 percent, considered the technology harmful to humanity overall, while 58 percent saw it as a possible threat to the future of humankind. By contrast, earlier surveys have shown experts are more optimistic, expecting efficiency gains and overall benefits, even as they acknowledge challenges.

Job-related concerns are being reinforced by industry data. A May analysis from SignalFire found major technology firms reduced hiring of new graduates by 25 percent between 2023 and 2024, a trend linked in part to automation.

Together, the findings suggest that Americans see AI as both a powerful tool and a disruptive force, with political stability, employment, social life, and resource use all at stake.

Notes: This post was edited/created using GenAI tools. 

Read next: 

• Meta Launches AI Voice Translation for Facebook and Instagram Creators

• Which War Has Killed The Most Journalists In Modern History?
by Asim BN via Digital Information World

Meta Launches AI Voice Translation for Facebook and Instagram Creators

Meta has rolled out an AI-driven voice translation feature on Facebook and Instagram. The tool lets creators translate spoken content in videos into another language and offers an option to match lip movements with the new audio.

The first release supports translations between English and Spanish. Meta has said more languages will follow, though no timeline is set. The company previewed the tool at last year’s Connect conference before testing it with selected creators.

The system copies the pitch and tone of a creator’s voice so the translation keeps a natural sound. Creators can enable the feature with a toggle marked “Translate your voice with Meta AI” before posting a reel. They can add lip-syncing or leave only the translated audio. Translations can be reviewed before sharing. If a translation is rejected, the original reel is unaffected. Viewers see a note that a reel has been translated, and they can turn the feature off in their settings if they prefer.




Meta recommends that creators face forward, speak clearly, and avoid covering their mouths. The system works best in quiet environments and supports up to two speakers, provided they do not speak over each other.

A new metric in the Insights panel shows views by language, giving creators a way to measure how their audience grows when translations are used.

Facebook page managers also have the option to upload up to 20 of their own dubbed audio tracks to a reel. These tracks do not include lip syncing but provide another way to reach people in different languages. The option is available in the “Closed captions and translations” section of the Meta Business Suite and works both before and after publishing.

The update is open to Facebook creators with at least 1,000 followers who have enabled Professional Mode, and to all public Instagram accounts in regions where Meta AI operates.

For comparison, YouTube launched its own AI-driven auto-dubbing tool before Meta’s release. That system began testing with select creators in mid-2023 and by December 2024 it was available to hundreds of thousands of YouTube channels in the Partner Program. It generated translated audio tracks across multiple languages and let creators review or remove them before publishing.

The launch comes as Meta restructures its artificial intelligence division to focus on research, superintelligence, products, and infrastructure.

Notes: This post was edited/created using GenAI tools.

Read next:

• ChatGPT Leads Downloads While TikTok Stays on Top in Revenue for July

• Which War Has Killed The Most Journalists In Modern History?
by Irfan Ahmad via Digital Information World

Which War Has Killed The Most Journalists In Modern History?

Wars have claimed the lives of reporters before, from Europe’s trenches to Vietnam’s jungles, but no conflict has taken such a toll on journalists as Gaza.

Brown University’s Costs of War project says that since October 7, 2023, more than 230 journalists and media workers have died there, a number higher than all journalist deaths combined in the US Civil War, the First and Second World Wars, Korea, Vietnam, the Balkan wars, and post-9/11 Afghanistan. By August 2025, the count had climbed further, with monitoring site Shireen.ps recording nearly 270 deaths (as per Al Jazeera). That works out to around 13 every month.

Gaza surpasses all past wars in journalist deaths, raising questions about press freedom, accountability, and international values.

Other watchdogs report slightly lower but still staggering figures. The Committee to Protect Journalists lists at least 184 Palestinian journalists killed, while Reporters Without Borders confirms more than 145, with over 35 known to have been deliberately targeted. Even with differences in counting, every source points to Gaza as the deadliest place ever for reporters.

Loss of voices

Israel has barred international reporters from entering Gaza, which has left local Palestinian journalists carrying the work of documenting the war. Many are now gone. Rights groups warn that the absence of these voices has created a gap in coverage, one that leaves grave abuses likely to pass without record.

The Committee to Protect Journalists says the deaths and detentions of reporters since October 7 have created a “news void,” stripping the world of first-hand accounts of a war that continues daily.

Israel’s position

Israel rejects the accusation that it is intentionally targeting members of the press. Officials say military operations are aimed at Hamas, which they accuse of embedding its fighters in civilian neighborhoods, using residential areas for command centers, and endangering anyone nearby, including journalists. The government stresses that its campaign was launched after the October 7 attacks, when Hamas fighters killed more than a thousand people and seized hostages inside Israel.

Global values under strain

International press freedom groups, including RSF and CPJ, issued an open letter earlier this year describing the constant risks faced by Palestinian journalists and the pressure they work under. Amnesty International has said the combined effect of killings and reporting restrictions has left the world with only fragments of what is happening in Gaza.

For many, the war has become a test of global values. Nations frequently affirm their support for protecting journalists and upholding civilian safety in war, yet the figures from Gaza suggest those commitments carry little weight in practice. The conflict has raised uncomfortable questions about whether international rules designed to protect reporters in battlefields still hold meaning when political priorities take precedence.

Al Jazeera has published the names of every journalist and media worker killed in Gaza since the war began.

See the list here.

Notes: This post was edited/created using GenAI tools.

Read next: 

• Amnesty Reports Starvation in Gaza as Israeli Policies Deepen Crisis

• ChatGPT Leads Downloads While TikTok Stays on Top in Revenue for July


by Irfan Ahmad via Digital Information World