Oftentimes, yes. I don’t want to have to carry a USB key on me all the time, but I always have access to MFA apps via my phone, watch, or laptop. I have no idea why you’re typing the code out instead of copying and pasting.
Just tippy-tappin’ my way across the internet.
Not really. A 4K movie means nothing to 99% of people. Is it 4GB? 40? 400? How many can my phone hold? Or my computer?
This only makes things more understandable if you use a point of reference that everyone you’re talking to is familiar with. The fact that they then had to explain how big a 4K movie is in the article shows that even they know it doesn’t help people. It’s just a big flashy number.
Just for context, I’m a writer, so I understand the point of using these abstract measures to give a frame of reference. But in this case, just giving the capacity in GB/TB would have been easier to understand. It just wouldn’t have made as sensational a headline.
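To put some numbers on how much that “movies” figure swings, here’s a quick sketch in Python. The drive capacity and per-movie sizes are made-up assumptions, picked only to show the spread:

```python
# Hypothetical example: how the "number of 4K movies" headline changes with the
# assumed file size. The drive capacity and per-movie sizes below are made up.

DRIVE_CAPACITY_TB = 30           # assumed drive size, in decimal terabytes
MOVIE_SIZES_GB = [4, 40, 400]    # plausible "4K movie" sizes: streaming file, Blu-ray rip, raw master

capacity_gb = DRIVE_CAPACITY_TB * 1000  # decimal TB -> GB, the way drive makers count

for size_gb in MOVIE_SIZES_GB:
    movies = capacity_gb // size_gb
    print(f"At {size_gb} GB per movie: ~{movies:,} movies")
```

Same hypothetical drive, a hundredfold difference in the headline number depending on what counts as a “4K movie”, which is exactly why the raw GB/TB figure is the more useful one.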
What a useless headline. God forbid they just give the actual capacity rather than some abstract, bullshit, flexible measure that means nothing to anyone.
But you’re still limited to the opinions of people who post on Lemmy, which, speaking as someone who occasionally posts on Lemmy, is not a shining beacon of quality.
Even if I just went by what I get on the first page of a Google search, I’d expect I’d find what I need much faster using Google than I would using Lemmy based purely on the volume of info Google has access to. And that’s not even taking into account things like Google’s ability to search within other sites.
Unless Lemmy has gotten like 100 billion times better in the last week, this isn’t even a fair comparison.
Edit: lol, just realised you’re the same guy from the Nvidia thread.
Nvidia makes some of the best hardware for training AI models. Increased investment in AI will inevitably increase demand for Nvidia hardware. It may boost other hardware makers too, but Nvidia is getting the biggest boost by far.
Maybe I’m being dumb or missing something but this feels incredibly obvious to me.
LLM tools can already write basic code and will likely improve a lot, but there are more reasons to learn to code than to actually do coding yourself. Even just to be able to understand and verify the shit the AI tools spit out before pushing it live.
Nvidia knows that the more people who use AI tools, the more their hardware sells. They benefit directly from people not being able to code themselves or relying more on AI tools.
They want their magic line to keep going up. That’s all.
It makes no sense. AI tools will obviously have an impact on how the profession develops, but suggesting that no one should learn to code is like saying no one should learn to drive because one day cars will drive themselves. It’s utter self-serving nonsense.
This is objectively stupid. There are tonnes of things you learn in maths that are useful for everyday life even if you don’t do the actual calculations by hand.
I deleted all my posts before closing my accounts back when they were breaking third-party apps, although I’m sure they probably kept a private log of all posts specifically for this purpose.
To be honest, I expect AI companies are scraping Lemmy and other places for training data anyway, but I’d rather Reddit specifically not make any money off my posts.
Realistically, a couple of 10TB drives would have me covered for like a decade at least. If these massive drives bring down the price of much smaller ones, I’m a happy boy.
I don’t mean to be dismissive of your entire train of thought (I can’t follow a lot of it, probably because I’m not a dev and not familiar with a lot of the concepts you’re talking about), but all the things you’ve described that I can understand would require these tools to be a fuckload better, by margins we haven’t even begun to approach, just to not be super predictable.
It’s all wonderful in theory, but we’re not even close to what would be needed to even half-ass this stuff.
That remains to be seen. We have yet to see one of these things actually get good at anything, so we don’t know how hard that last part is to do. I don’t think we can assume there will be continuous linear progress. Maybe it’ll take one year, maybe it’ll take 10, maybe it’ll just never reach that point.
Writer here, absolutely not having this experience. Generative AI tools are bad at writing, but people generally have a pretty low bar for what they think is good enough.
These things are great if you care about tech demos and not quality of output. If you actually need the end result to be good though, you’re gonna be waiting a while.
This shit seems like the perfect basis for a discrimination lawsuit.
Maybe it’s time for a grown-up CEO.
Web 4, now with Grift 2.
Web 4 — it’s web 3 but this time the grift is slightly different.
There are high-schoolers alive today (maybe even in this thread) who were born after Apple stopped charging for updates. Maybe it’s time to get some new jokes.
I’m gonna guess he has way more than two burners. The dude is an addict, tweeting is all he has.