Possible effects on cognitive function of using AI.

Wayne

Senior Member
Messages
4,853
Location
Ashland, Oregon
1.75 billion liters, oh and this is potable water too

Hi @bad1080 -- Just to mention, I've long believed we should be good "stewards" of the earth, and have all my life advocated for strong environmental protections, voting for politicians who felt the same way. So I want to say, I highly respect your concern for the environmental impacts of these data centers, which I have as well.

Unfortunately (getting back to politicians), Texas is a prime location for data centers because of its very loose regulations concerning water usage. Water-intensive evaporative cooling is the cheapest way to cool those data centers, so that's where they're built (perhaps for other reasons as well).

The thing is, there are much better ways to cool these centers. But, you guessed it, it costs more money. In fact, air cooled centers use virtually no water (though they use a lot of electricity and make a lot of noise--both not good). There are "hybrid" options which are better.

But the best option (and most expensive--though not by that much) is immersion cooling. I just learned that if Texas required immersion (near-zero water usage) for these new centers, it would cut hundreds of billions of gallons from the 2030 water projection (in the article) and dramatically reduce community noise impacts. Without such rules, most operators will keep choosing the cheaper, more water-intensive path.
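As a rough sketch of the stakes, in Python, using the 2030 projection quoted later in the thread (399 billion gallons); the "near-zero" residual fraction for immersion is my assumption, not a figure from the article:

```python
# Back-of-the-envelope estimate of water saved if new Texas data centers
# used immersion cooling. Figures are projections quoted in this thread,
# not measurements; the residual fraction is an assumption.
projected_2030_gal = 399e9      # HARC projection for Texas data centers, 2030
immersion_residual = 0.05       # assumed "near-zero" residual water use

saved_gal = projected_2030_gal * (1 - immersion_residual)
print(f"Potential savings: {saved_gal / 1e9:.0f} billion gallons/year")
```

Under these assumptions the savings come out to roughly 379 billion gallons a year, which is the "hundreds of billions of gallons" scale mentioned above.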

My own perspective is that it's the state of TX that's primarily responsible for all this projected water usage. They could stop it any time they wanted. But it's highly unlikely they will, so the die seems to be cast.

BTW, between the massive amounts of water used for hydraulic fracking and agriculture in TX (and now the data centers), they're rapidly depleting the treasure of underground water they were blessed with. At current rates, it will be depleted within a few decades. Some areas are already hugely affected.

Hard to believe that in a matter of about a century or so, water resources built up over eons of time will be totally squandered, just because politicians lacked the foresight (and/or courage) to protect it. Also, BTW, states on the northern boundaries of the Ogallala aquifer (like Nebraska) do much better at protecting their underground water resources than Texas.
 
Last edited by a moderator:

southwestforests

Senior Member
Messages
1,561
Location
Missouri

I'm going to pass judgement on that and call it downright immoral.

This from the linked article is a point of interest:

According to the Chronicle article, a white paper submitted to the Texas Water Development Board projected that data centers in the state will consume 49 billion gallons of water in 2025. That number is expected to rise to 399 billion gallons by 2030, nearly 7% of the state’s total projected water use.

“People don’t think of data centers as industrial water users, but they are,” said Robert Mace, executive director of The Meadows Center for Water and the Environment at Texas State University.

In the Hill Country region, where several new AI-focused centers are under construction, locals are sounding the alarm. Not only do these facilities demand significant water for evaporative cooling, but much of that water evaporates and cannot be recycled. While some facilities rely on recycled water, many still draw heavily from drinking water supplies.

“Once that water evaporates, it’s just gone,” Mace told The Austin Chronicle.

and then a search just now found,


Texas Data Centers Use 50 Billion Gallons of Water as State Faces Drought​

Published Aug 01, 2025 at 9:20 AM EDT Updated Aug 04, 2025 at 3:41 AM EDT

https://www.newsweek.com/texas-data-center-water-artificial-intelligence-2107500

The Houston Advanced Research Center (HARC), in a report obtained by the San Antonio Express-News estimated that Texas data centers would use 49 billion gallons of water in 2025, with consumption projected to soar to 399 billion gallons annually by 2030—representing almost 6.6 percent of the state's total water usage.

While midsize data centers typically use 300,000 gallons daily, comparable to consumption by 1,000 households, large-scale facilities such as those recently built or planned in Texas can consume as much as 4.5 million gallons daily.

Unlike electricity, where Senate Bill 6 granted the Electric Reliability Council of Texas authority to cut power to data centers and other heavy users during emergencies, no analogous state law exists to regulate their water use.

Most data centers rely on evaporative cooling, which consumes large volumes of water and results in significant waste lost to evaporation.
 

southwestforests

Senior Member
Messages
1,561
Location
Missouri
Just found this,

Texas Is Still in Drought, and AI Data Centers Are Quietly Guzzling Up Water​


Texas continues to attract AI giants, despite resource strain​


By Sammie Seamon, Fri., July 25, 2025​


https://www.austinchronicle.com/new...i-data-centers-are-quietly-guzzling-up-water/

Nonetheless, no equivalent bill was passed this session to regulate data centers’ water use in Texas.

“Water lags energy, in how we address concerns,” said Margaret Cook, vice president of Water and Community Resilience at the Houston Advanced Research Center. “There are policies that have caught on for large [energy] loads that we don’t have on the water side.”

The average, midsized data center uses 300,000 gallons of water a day, roughly the use of a thousand homes. Larger data centers might use 4.5 million gallons a day, depending on their type of water cooling system. Austin has 47 such data centers, while the Dallas-Fort Worth area hosts the majority in Texas at 189.

It’s been difficult for HARC and experts like Robert Mace, executive director of the Meadows Center for Water and the Environment at Texas State University, to extract transparent water usage reports from data centers. “Their use could be horrific relative to local use, or it could be extremely minimal,” Mace said.

In a white paper to be released this month, HARC estimates that data centers in Texas will consume 49 billion gallons of water in 2025. They also project that by 2030, that number could rise up to 399 billion gallons, or 6.6% of total water use in Texas.
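Tangentially, the quoted figures can be sanity-checked with a couple of lines of Python (purely arithmetic on the numbers as reported):

```python
# Cross-check the quoted projections: 49 billion gallons in 2025,
# 399 billion by 2030, said to be 6.6% of total Texas water use.
use_2025_gal = 49e9
use_2030_gal = 399e9
share_2030 = 0.066

growth = use_2030_gal / use_2025_gal            # roughly 8x in five years
implied_state_total = use_2030_gal / share_2030 # implied statewide total

print(f"2025 to 2030 growth: {growth:.1f}x")
print(f"Implied state total: {implied_state_total / 1e9:,.0f} billion gal/yr")
```

That works out to about an 8x increase in five years, against an implied statewide total of roughly 6,000 billion gallons per year, so the 6.6% figure is internally consistent with the gallon counts.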

Most data centers use an evaporative cooling system, in which the servers’ heat is absorbed by water. The heat is then removed from the water through evaporation, causing the water to be lost as vapor in the air. The cooler water then goes back through the machines, and this loop is regularly topped off with fresh water. After all, evaporation renders the water saltier and unusable after four or five cycles. “Then they dump the water, and it goes down the sewer,” Mace said.
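The loop described above maps onto the standard (simplified) cooling-tower water balance. The 4-5 cycles figure is from the article; the evaporation rate in the example below is hypothetical:

```python
# Cooling-tower water balance implied by the article's description:
# recirculating water evaporates, concentrates minerals, and after
# ~4-5 cycles of concentration (CoC) is dumped as blowdown.
# Standard simplified relations:
#   blowdown = evaporation / (CoC - 1)
#   makeup   = evaporation + blowdown
def daily_makeup(evaporation_gal: float, cycles: float) -> float:
    """Fresh water needed per day to replace evaporation plus blowdown."""
    blowdown = evaporation_gal / (cycles - 1)
    return evaporation_gal + blowdown

# Hypothetical midsize facility evaporating 240,000 gal/day at 5 cycles:
print(f"{daily_makeup(240_000, 5):,.0f} gal/day of makeup water")
```

With those hypothetical inputs, the makeup requirement lands at 300,000 gal/day, the same order as the article's "average midsized data center" figure; fewer cycles means proportionally more blowdown and more fresh water.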

This water loss is significant when, even after the devastating flooding earlier this month, nearly a quarter of the state remains in drought conditions.

The Texas Water Development Board, the governor-appointed board in charge of drafting the State Water Plan every five years, will release their next plan in 2027. But according to Cook, data centers are not being taken into account when calculating projected state water use for the 2027 plan, and thus in calculating how much additional water we need to conserve and produce to have enough for Texans.
 

Viala

Senior Member
Messages
917
Texas Water Development Board projected that data centers in the state will consume 49 billion gallons of water in 2025. That number is expected to rise to 399 billion gallons by 2030, nearly 7% of the state’s total projected water use.

So if they are telling people to shower less at 0.9%, what will they do at 7%?

Oh I know what will happen, water prices will increase, which will be yet another commodity after housing, electricity, heating and food with skyrocketing prices. Things that should be cheap by default because these are basic needs. They were cheap just a few decades ago, now we are so technologically advanced that having kids and buying healthy food is too expensive. AI data centers will be another excuse to increase prices of water, because there is no doubt they will increase it more than they should.

I have an idea: what if politicians dramatically increased water prices for these data centers and put limits on their water usage NOW, to secure water reserves for citizens at current levels? This would stop the AI industry from developing too fast and taking away our jobs. If hiring AI were more expensive than hiring humans, the problem would be solved. But they won't do that, of course. WE are the ones who will pay for their development.
 

Rufous McKinney

Senior Member
Messages
14,778
It is pretty much inevitable,
because inevitably somebody who is profiting will override the commons and destroy it. This is what human nature is. It only takes one to ruin it for everyone else.

It's why laws about the commons were created about as far back as human writing goes.
 

Dysfunkion

Senior Member
Messages
662
So if they are telling people to shower less at 0,9%, what they will do at 7%?

Oh I know what will happen, water prices will increase, which will be yet another commodity after housing, electricity, heating and food with skyrocketing prices. Things that should be cheap by default because these are basic needs. They were cheap just a few decades ago, now we are so technologically advanced that having kids and buying healthy food is too expensive. AI data centers will be another excuse to increase prices of water, because there is no doubt they will increase it more than they should.

I have an idea, what about politicians dramatically increase water prices for these data centers and put limits on their water usage NOW, to secure water reserves for citizens at current levels. This will stop AI industry from developing too fast and taking away our jobs. If hiring AI will be more expensive than hiring humans, then the problem will be solved. But they won't do that, of course. WE are the ones who will pay for their development.

We're always the ones paying for what gets developed that no one needs, making everything worse. Off the top of my head, even in my short time of using ChatGPT I can't think of anything it actually improved. At best I was able to process some data faster. Like I said, AI is nothing new, but widespread, massively resource-consuming, world-ruining use like now is. Like how I mentioned processing data faster: I was doing some research on health, looking for links between various specific things, that would have been a day-long project without a tool like that. Another time I just wanted it to find some easily accessible solid information you didn't need to dig much deeper than Wikipedia to find and put it in a chart quick. A perfectly fine personal use, but not use that justifies what it's doing in the big picture. I would continue to use it if I had a small program that only does this as needed, with that level of ability, and that stops the second I close it out. Nothing being stored, no massive central data centers consuming a trillion gigs of bandwidth and all the water in the ocean a day, etc. I open it, it does its thing, generates megabytes of temporary data, and closes out. But without saying more than I need to, it's become a constantly crawling monster being used against us.

I'm not against AI, I'm against how it's being used. I know some people out there are using it in great ways that don't step on anyone's toes. But the bulk of it is generating constant digital noise over millions of stored queries about great pasta recipes, giving you worse search results, generating endless amounts of worthless content by the terabyte or more daily, justifying more physical resource consumption (not just water but hard drives and computer parts WE need too, driving up prices and creating shortages), harvesting your data, censoring information, creating bogus information, devaluing genuine data, and replacing humans.

Meanwhile, all the information on websites like this can be stored on a single hard drive, contains more value than all the data ChatGPT generates in a month using barely a fraction of the resources, and has only real people doing real things. ChatGPT may be able to scan a study and sum up specific data from it, but it will never contain the complexity and specificity of experience of actual data from people applying experience and medical information/studies in real-world settings. A single page of a topic on here can be more valuable to someone than any amount of feeding data into AI, because AI cannot have the human experience required for the depth of discussion needed for the information to have that value. And all that information's size? Kilobytes, maybe megabytes if someone posted some pictures. Yeah, there's a lot of pretty redundant data here too, but it's not "millions of queries on great pasta recipes fed through a system generating terabytes of worthless data a day" either.
 

southwestforests

Senior Member
Messages
1,561
Location
Missouri

Google’s healthcare AI made up a body part — what happens when doctors don’t notice?​

Google dubbed an error from its Med-Gemini model a typo. Experts say it demonstrates the risks of AI in medicine.

by Hayden Field Aug 4, 2025, 2:00 PM UTC

https://www.theverge.com/health/718049/google-med-gemini-basilar-ganglia-paper-typo-hallucination

Scenario: A radiologist is looking at your brain scan and flags an abnormality in the basal ganglia. It’s an area of the brain that helps you with motor control, learning, and emotional processing. The name sounds a bit like another part of the brain, the basilar artery, which supplies blood to your brainstem — but the radiologist knows not to confuse them. A stroke or abnormality in one is typically treated in a very different way than in the other.

Now imagine your doctor is using an AI model to do the reading. The model says you have a problem with your “basilar ganglia,” conflating the two names into an area of the brain that does not exist. You’d hope your doctor would catch the mistake and double-check the scan. But there’s a chance they don’t.

Though not in a hospital setting, the “basilar ganglia” is a real error that was served up by Google’s healthcare AI model, Med-Gemini. A 2024 research paper introducing Med-Gemini included the hallucination in a section on head CT scans, and nobody at Google caught it, in either that paper or a blog post announcing it. When Bryan Moore, a board-certified neurologist and researcher with expertise in AI, flagged the mistake, he tells The Verge, the company quietly edited the blog post to fix the error with no public acknowledgement — and the paper remained unchanged. Google calls the incident a simple misspelling of “basal ganglia.” Some medical professionals say it’s a dangerous error and an example of the limitations of healthcare AI.
 

bad1080

Senior Member
Messages
540

Google’s healthcare AI made up a body part — what happens when doctors don’t notice?​

Google dubbed an error from its Med-Gemini model a typo. Experts say it demonstrates the risks of AI in medicine.

by Hayden Field Aug 4, 2025, 2:00 PM UTC

https://www.theverge.com/health/718049/google-med-gemini-basilar-ganglia-paper-typo-hallucination
this is going to be so much worse once everybody uses AI to cheat through their education, so they are much less likely to catch an error like that
 

southwestforests

Senior Member
Messages
1,561
Location
Missouri
this is going to be so much worse once everybody uses AI to cheat through their education, so they are much less likely to catch an error like that
Which brings to mind this from the article,

“These things propagate. We found in one of our analyses of a tool that somebody had written a note with an incorrect pathologic assessment — pathology was positive for cancer, they put negative (inadvertently) … But now the AI is reading all those notes and propagating it, and propagating it, and making decisions off that bad data.”
 

Viala

Senior Member
Messages
917
I know some people out there are using it in great ways that doesn't step on anyone's toes. But the bulk of it is generating constant digital noise over millions of stored queries of great pasta recipes, giving you worse search results, generating endless amounts of worthless content in the TB's or more daily justifying more physical resource consumption (not just water but hard drives and computer parts WE need too, driving up prices and creating shortages), harvesting your data, censoring information, creating bogus information, devaluing genuine data, and replacing humans.

It's our human nature to make our lives more convenient. Using AI as a search engine is a disaster though. It will lead to a situation where searches are bubbled and individualized. If actual search engines disappear, this will lead to a dynamic internet which may show everyone something different, the same idiocy and dangers as with dynamic prices. They will integrate search engines with AI anyway; it's already happening. But we shouldn't help it.

AI will only give more power to intelligence agencies over how to influence people. Like there was this one guy recently who wanted to eat less chloride, and AI told him to eat sodium bromide instead; he did and got poisoned, ended up with hallucinations and psychosis. We will never know if what AI tells us is just a program, or if someone behind that AI wants to try something, lol. Any emotional attachment should be an instant red flag. I think people who use it purely intellectually and within reason will be the most resilient to its side effects. The rest are playing with the devil here. AI girlfriends, AI buddies, AI lovers, AI therapists, AI secret keepers, AI comment writers. It's so successful because it's a pleasant experience and it fills some gaps. The market value is in generating pleasant feelings and data aggregation. Then, mice given an endless supply of chow just won't stop. We won't stop and they won't stop. Then they'll say they can't turn it off because it's too massive. Having so much fun, staring down a loaded gun.
 

I am sick

Senior Member
Messages
290
We always are the one's paying for what is developed that no one needs making everything worse. Off the top of my head even in my short time of using Chatgpt I can't think of anything it actually improved. At best I was able to process some data faster. Like I said AI is nothing new but widespread, massive resource consuming and world ruining use like now is. Like how I mentioned processing data faster, I was doing some research on health looking for links between various specific things that would have been a day long project without a tool like that. Another time I just wanted it to find some easily accessible solid information you didn't need to dig much deeper than wikipedia to find and put in a chart quick. A perfectly fine personal use but not use that justifies what it's doing in the big picture. I would continue to use it if I had a small program that only does this as needed with that level of ability that stops the second I close it out. Nothing being stored, no massive central data centers consuming a trillion gigs of bandwidth and all the water in the ocean a day, ect-. I open it, it does it's thing, generates megabytes of temporary data, and closes out. But without saying more than I need to, it's become a constantly crawling monster being used against us.

I'm not against AI, I'm against how it's being used. I know some people out there are using it in great ways that doesn't step on anyone's toes. But the bulk of it is generating constant digital noise over millions of stored queries of great pasta recipes, giving you worse search results, generating endless amounts of worthless content in the TB's or more daily justifying more physical resource consumption (not just water but hard drives and computer parts WE need too, driving up prices and creating shortages), harvesting your data, censoring information, creating bogus information, devaluing genuine data, and replacing humans.

Meanwhile all the information on websites like this can be stored on a single hard drive, contain more value than all the data Chatgpt generates in a month using barely a fraction of the resources, and has only real people doing real things. Chatgpt may be able to scan a study and sum up specific data from it but it will never contain the complexity and specificity of experience of actual data from the people using experience and medical information/studies in real world applications. A single page of a topic on here can be more valuable to someone than any amount of feeding data into AI because AI can not have the human experience required for the depth of discussion needed for the information to have that value. And all that information's size? Kilobytes, maybe megabytes if someone posted some pictures. Yeah there's a lot of pretty redundant data here too but it's not "millions of queries on great pasta recipes fed through a system generating more than TB's of worthless data a day" either.

Just found this,

Texas Is Still in Drought, and AI Data Centers Are Quietly Guzzling Up Water​


Texas continues to attract AI giants, despite resource strain​


By Sammie Seamon, Fri., July 25, 2025​


https://www.austinchronicle.com/new...i-data-centers-are-quietly-guzzling-up-water/
Hi
I would have expected them to be using a closed-loop chiller system for cooling; that way the right chemicals could be used to treat the water to prevent scaling deposits, etc.
Or use glycerin or a mix.
That would save them money long term, so they don't have to replace piping. And better efficiency.
Yes, they would occasionally have to add water because of evaporation.
That is how I would have designed it for them.
It sounds like they must be using cooling towers instead; those are way inefficient with water, but cheaper too if your water is free!
And it can be treated.
What a waste of water, using a cooling tower.
I am going to research that, to see why they chose that approach.
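The tradeoff this post describes (a cooling tower trades electricity for water; a closed loop trades water for electricity) can be sketched with illustrative numbers. Every figure below is an order-of-magnitude assumption for illustration, not data from the articles:

```python
# Illustrative water-vs-electricity tradeoff between an evaporative
# cooling tower and a closed-loop (dry/chiller) system. All numbers
# below are order-of-magnitude assumptions for illustration only.
heat_load_mw = 30                 # assumed IT heat load to reject
hours_per_year = 8760             # hours in a year
tower_liters_per_kwh = 1.8        # assumed evaporative loss per kWh of heat

heat_kwh = heat_load_mw * 1000 * hours_per_year
tower_water_liters = heat_kwh * tower_liters_per_kwh  # closed loop: near zero

print(f"Tower evaporates ~{tower_water_liters / 1e6:,.0f} million L/year; "
      f"a closed loop avoids most of this at the cost of extra "
      f"fan/compressor electricity.")
```

With these assumed inputs a single 30 MW facility evaporates on the order of hundreds of millions of liters a year, which is why operators in cheap-water jurisdictions default to towers.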
 

Dysfunkion

Senior Member
Messages
662
It's our human nature to make our lives more convenient. Using AI as a search engine is a disaster though. It will lead to a situation when searches will be bubbled and individualized. If actual search engines disappear this will lead to a dynamic internet which may show everyone something else, same idiocy and dangers as with dynamic prices. They will integrate search engines with AI anyways it's already happening, but we shouldn't help it.

AI will only give more power to intelligence agencies on how to influence people. Like there was this one guy recently who wanted to eat less chloride and AI told him to eat sodium bromide instead, he did and got poisoned, ended up with hallucinations and psychosis. We will never know if what AI tells us is just a program, or if someone behind that AI wants to try something lol. Any emotional attachment should be an instant red flag. I think people who will use it purely intellectually and within reason will be the most resilient to it's side effects. The rest is playing with the devil here. AI girlfriends, AI buddies, AI lovers, AI therapists, AI secret keepers, AI comment writers. It's so successful because it's a pleasant experience and it fills some gaps. The market value is in generating pleasant feelings and data aggregation. Then, mice given an endless supply of chow just won't stop. We won't stop and they won't stop. Then they'll say they can't turn it off cause it's too massive. Having so much fun, staring down a loaded gun.

Search engines already have optional AI use, and these days often have AI in the backend. They all seem to be running their own models; Brave's, I found, is the best and will at least give you the sources of the information it digs up. Total removal of search engines would be a disaster though; so much would go wrong there, and fast.

You bring up an important point there; that's partly what I meant by censorship and misinformation. AI algorithms can be weaponized and biased very quickly, programmed to withhold information from you or give you something completely different with negative intent. I'm pretty sure AI is the last thing I would listen to if it just straight up told me to consume something. AI can get extremely dangerous like that, especially when it doesn't link out to sources or give you the list of information it is crawling.

It is heavily exploiting human, or even animal, brains like that in general. You see that pattern unfold in life in many ways. With humans this doesn't just restrict itself to food but extends to other conveniences too, and no one is free of it in the modern world. This is just reckless though: convenience and speed at all costs. It's comparable in ways to the DoorDash/whatever-else delivery-service-ification of data. Delivery started out as something great in smaller-scale applications like grocery-store-specific services, Chinese food, or pizza, but when it was seen as something to extract large amounts of profit out of and expanded to everything under the sun, it quickly became a disaster. Much more convenience, but no one is actually happier; people are more miserable than ever.
 

Oliver3

Senior Member
Messages
1,169
I'm a songwriter. A.I. is an absolute nightmare for all the arts.
We don't even realise how our ears are becoming corrupted.
Some of the stuff, say by the Stones, mid-sixties era, sounds 'infantile' to the average listener with today's freakish precision.
But worse is A.I.'s ability to create music.
I'm hoping the human element will always win out, but as it progresses in complexity, the old human ways of creating might become obsolete.
 

Oliver3

Senior Member
Messages
1,169
Using AI is major brain intensive work for me. It's different than if I had to slog away at things the way I used to. But no less rigorous--possibly more rigorous in some ways. And the results are many times more efficient and better than my old way of doing things. It puzzles me that more people don't see the endless possibilities of AI to improve our lives in many ways.

Not to mention getting an education on just about anything we could imagine. A quality education that compares (in my mind) to what people have paid tens of thousands of dollars for. And then there's health research, which is many times more efficient with AI than anything we've had previously. We can have conversations with AI that we could only dream of having with a doctor. And it doesn't gaslight us, and goes in whatever direction we ask it to. Using it is an exercise in creativity for me. And I intend to use it to maximize whatever benefits I can derive from it.
Using AI is deeply brain-intensive work for me—just as demanding, if not more so, than the traditional methods I used to rely on. It’s a different kind of rigor, but no less substantial. The difference is in the results: far more efficient and, in many cases, better than what I achieved before.

I’m often puzzled that more people don’t fully grasp the endless possibilities AI offers for improving our lives. It’s not just a tool for productivity—it’s a gateway to high-quality education on virtually any subject we can imagine. In my view, the learning experience it provides rivals what many have paid tens of thousands of dollars for.

Then there’s the potential for health research. With AI, we can explore medical information and ask questions in ways that are far more efficient than anything we’ve had access to before. We can have deep, open-ended conversations with it—conversations we’ve often only wished we could have with a doctor. AI doesn’t dismiss or gaslight us; it follows our curiosity wherever it leads.

For me, using AI is an exercise in creativity. And I plan to keep using it to its fullest potential—maximizing every benefit it can offer.
Can you expand on using it creatively? It's not sophisticated enough yet to create art; it needs a human interface.
I stay well away from it in my arts stuff.
How can it replicate true art? It's not human.
 