

because i care about the wellbeing of cats? tf are you even talking about
i should be gripping rat
Hi Penguin, the alt text in the body was added by me after the fact. That alt text is actually the body text of the post; the image itself apparently has no alt text.
sorry for having a moral backbone ig. i have two cats that are traumatized from living outdoors. When the front door is open, they don’t try to bolt, they stay put because they know they have it good. The outdoors is not good for domesticated cats, and domesticated cats are not good for local wildlife. There’s zero reason to send them outdoors, unless you like inflicting harm on animals. I am very passionate about this, and it makes me very sad that people dump their babies outdoors because they don’t want to spend the time and effort to provide them with enrichment.
Agreed, if you can’t keep them inside you shouldn’t get a cat.
Cats shouldn’t be outdoor pets. They are an invasive species in most parts of the world, and devastate local wildlife. Keep them inside.
thank you for that check. here’s what i am seeing on browser:
Beehaw just did a version upgrade last night, so there could be some server weirdness still.
there is a new “alt text” box when i am posting, and I entered that alt text, but it doesn’t seem like that alt text is showing up anywhere. Any clues about this?
Big article, but a great read! Some key excerpts:
This isn’t simply the norm of a digital world. It’s unique to AI, and a marked departure from Big Tech’s electricity appetite in the recent past. From 2005 to 2017, the amount of electricity going to data centers remained quite flat thanks to increases in efficiency, despite the construction of armies of new data centers to serve the rise of cloud-based online services, from Facebook to Netflix. In 2017, AI began to change everything. Data centers started getting built with energy-intensive hardware designed for AI, which led them to double their electricity consumption by 2023. The latest reports show that 4.4% of all the energy in the US now goes toward data centers. Given the direction AI is headed—more personalized, able to reason and solve complex problems on our behalf, and everywhere we look—it’s likely that our AI footprint today is the smallest it will ever be. According to new projections published by Lawrence Berkeley National Laboratory in December, by 2028 more than half of the electricity going to data centers will be used for AI. At that point, AI alone could consume as much electricity annually as 22% of all US households.
Let’s say you’re running a marathon as a charity runner and organizing a fundraiser to support your cause. You ask an AI model 15 questions about the best way to fundraise. Then you make 10 attempts at an image for your flyer before you get one you are happy with, and three attempts at a five-second video to post on Instagram. You’d use about 2.9 kilowatt-hours of electricity—enough to ride over 100 miles on an e-bike (or around 10 miles in the average electric vehicle) or run the microwave for over three and a half hours.
One can do some very rough math to estimate the energy impact. In February the AI research firm Epoch AI published an estimate of how much energy is used for a single ChatGPT query—an estimate that, as discussed, makes lots of assumptions that can’t be verified. Still, they calculated about 0.3 watt-hours, or 1,080 joules, per message. This falls in between our estimates for the smallest and largest Meta Llama models (and experts we consulted say that if anything, the real number is likely higher, not lower).
One billion of these every day for a year would mean over 109 gigawatt-hours of electricity, enough to power 10,400 US homes for a year. If we add images and imagine that generating each one requires as much energy as it does with our high-quality image models, it’d mean an additional 35 gigawatt-hours, enough to power another 3,300 homes for a year. This is on top of the energy demands of OpenAI’s other products, like video generators, and that for all the other AI companies and startups.
But here’s the problem: These estimates don’t capture the near future of how we’ll use AI. In that future, we won’t simply ping AI models with a question or two throughout the day, or have them generate a photo. Instead, leading labs are racing us toward a world where AI “agents” perform tasks for us without our supervising their every move. We will speak to models in voice mode, chat with companions for 2 hours a day, and point our phone cameras at our surroundings in video mode. We will give complex tasks to so-called “reasoning models” that work through tasks logically but have been found to require 43 times more energy for simple problems, or “deep research” models that spend hours creating reports for us. We will have AI models that are “personalized” by training on our data and preferences.
By 2028, the researchers estimate, the power going to AI-specific purposes will rise to between 165 and 326 terawatt-hours per year. That’s more than all electricity currently used by US data centers for all purposes; it’s enough to power 22% of US households each year. That could generate the same emissions as driving over 300 billion miles—over 1,600 round trips to the sun from Earth.
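Side note: the back-of-envelope math in that excerpt holds up. Here's a quick Python sketch reproducing the 109 gigawatt-hour and 10,400-home figures; the only number not taken from the article is the roughly 10,500 kWh/year I'm assuming for an average US household.

```python
# Quick sanity check of the article's per-query numbers. Values are either
# quoted from the excerpt or flagged below as my own assumptions.

WH_PER_QUERY = 0.3                 # Epoch AI estimate: ~0.3 Wh (= 1,080 J) per ChatGPT message
QUERIES_PER_DAY = 1_000_000_000    # "one billion of these every day"
DAYS_PER_YEAR = 365
HOME_KWH_PER_YEAR = 10_500         # assumption: rough average annual US household usage

annual_wh = WH_PER_QUERY * QUERIES_PER_DAY * DAYS_PER_YEAR
annual_gwh = annual_wh / 1e9                        # Wh -> GWh
homes = annual_wh / (HOME_KWH_PER_YEAR * 1_000)     # convert kWh/home to Wh/home

print(f"~{annual_gwh:.0f} GWh per year")    # ~110 GWh, in line with "over 109 gigawatt-hours"
print(f"~{homes:,.0f} US homes powered")    # ~10,400 homes
```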
That is a fair point! That could be neat. Still not worth the environmental cost of using this technology, but interesting in a vacuum!
don’t worry, the AI can do that badly too!
developers have been working on this, but it doesn’t scale to games in the way you might think. For one, games have to communicate with data centers to process LLMs, so we will still have to deal with the lag of data transmission and processing. The other problem is that, in general, the AI just isn’t very good. ChatGPT has all the hype because it is very convincing, but it does not actually know what it is talking about. Go ask ChatGPT to add up 5 multi-digit numbers and watch it fail at a task your pocket calculator can complete in seconds. All these LLMs are doing is taking your input and spitting out a response that sounds correct based on how people usually respond to that input. In the context of a game, this means any dynamic conversation you might have with an NPC would go flying off the rails in ways that would make the game feel broken and/or unfinished. Go watch the video in the linked article and make your own judgment.
interesting to discover that MAGA does have a line for what they are willing to accept. Up until this moment I truly thought he could say or do whatever he wants and MAGA would drink it down like castor oil. Also fascinating that Trump has brushed up against this line already, not even 6 months into this term. We are in for a long 4+ years…
the right detains people for simply holding different political views and talking about them online. Meanwhile, the Biden admin couldn’t even be bothered to file the papers for Trump’s actual treason against the United States.
Every Shotcut update is a win for the world. Shotcut is already perfect for small-scale video projects that don’t require all the bells and whistles. Every update brings it closer to feature parity with the Premieres and Vegases of the world.
from the very first line of the linked article:
When Australia holds its federal elections on Saturday, it’ll do so with the requirement that all eligible citizens head to the polls and vote. If they don’t, the Australian Electoral Commission will fine them $20 AUD (that’s roughly $13 USD).
Even just the excerpts that alyaza provided make it clear that the headline is more metaphorical than literal.
exactly what i was gonna post. it’s hidden by default now and you can set a username to further hide that identity
deleted by creator
my opinions are based in facts and data, my dude. My personal connection to my two cats makes me extra passionate about this, but if the data said it was good for them to be outside, that’s what i would be advocating for. Full stop.