also I just realized that Brazil did NOT make a programming language entirely in Spanish and call it “Si” and that my professor was making a joke about C… god damn it
this post is probably too niche but I feel like Lemmy is nerdy enough that enough people will get it lol
Python is okay for some things. It’s just that software in general has become terrible, because access to fast hardware lets people waste so much of its power. In the 90s your entire environment would use a few MB of RAM. I know high-res images and the like push some of that up, but people are so wasteful with how they write things these days. We are evolving backwards: we spend hundreds or thousands on amazing hardware only to have it run like trash in a world where everything is written in Java and Python and Electron. Developers no longer optimize. They just get their webpage to run at an inconsistent 30 FPS on your $2000 computer and collect their 150k salary, on a machine that has more computing power than every computer in the world put together in the 90s.
It’s not just bad for your time and sanity. It’s bad for the environment, it’s bad for the economy, and this same rot is working its way into operating systems and into game engines. Every game written for UE5 seems to run at 50 FPS regardless of how good your PC hardware is, because of these same low-quality programmers and terrible tools. I don’t know, Linux has been a breath of fresh air for me in recent times, as rough as it can be. It’s mostly C code with tiny binaries that are usually like 1-3 MB. I guess there is a silver lining in that all of these evil corporations like Google and Meta and Apple are dying because of this. Maybe the internet will go back to being centered around user content in a distributed fashion, and not just a couple of highly controlled websites that try to brainwash you into supporting your corporate-backed government. It already seems like every AAA game studio sucks and all the best games that have come out in the past 15 years have been from small indie studios.
Have you heard of the term “Software crisis”?
We don’t really talk all that much about it anymore, because it’s become so normal, but the software crisis was the point where computers became faster than human programmers. That problem came up in the 1960s.
Up until then a computer was simple enough that a single human being could actually understand everything that happened under the hood and could write near-optimal code by hand.
Since then computers doubled in performance and memory every few years, while developers have largely stayed human with human performance. It’s impossible for a single human being to understand everything that happens inside a computer.
That’s why, ever since, we have been optimizing for developer time over execution time.
We have been using higher-level languages, frameworks, middlewares and so on to cut the time it takes to develop stuff.
I mean, sure, we could develop like in the 90s, the tools are all still there, but neither management nor customers would accept that: development would take many times longer and cost far more, while delivering fewer features on fewer platforms than people now expect.
So there’s more and more to do with less and less time and money.
We can square that circle either by reducing software quality into nothingness, or by using higher-level developer tools that allow for faster and less error-prone development while utilizing the performance that still grows exponentially.
What would you choose?
But ultimately, it’s still the customer’s choice. You don’t have to use VSCode (which runs on Electron). You can still use Kate. You don’t have to use Windows or Gnome or macOS. You can use Linux and run something like IceWM on it. You don’t have to use the newest MS Office; you can use Office 2013 or LibreOffice.
For pretty much any Electron app out there, there’s a native alternative.
But it’s likely you don’t use them. Why is that? Do you actually prefer the flashy, pretty, newer alternative that looks and feels better?
And maybe question why it feels so hard to pay €5 for a mobile app, and why you choose the free option over the paid one.
I actually pay for software, but I run Linux; I’m not paying for Windows because it’s bad. I prefer to pay rather than have ads. I value my time.
What you are saying is somewhat true. There are hundreds of thousands of programmers these days, if not millions. The quality of the people writing software just isn’t what it used to be. Not that they don’t work hard, just that they aren’t capable of writing C.
You can also understand everything in a system, at least some people can. I understand those people are rare and expensive to hire.
One thing C really lacks is modern libraries for these things. It’s not a limitation of C itself; it’s just that most modern tools target other languages. I understand that writing webapps in C isn’t the best idea, because you usually don’t want web stuff running directly on hardware if you care about security, but it’s really just a trend: the industry moved away from C with all of its frameworks and such, and that has not been good for users.
Windows 98 was really good if you knew how it worked. I never really had any issues with stuff like XP. It always worked, it was always fast, it was always stable. I used XP for probably 10 years and never had stability problems, even though I was constantly modifying things: overclocking, patching drivers, modding BIOSes, doing weird stuff that others didn’t do and coming up with my own solutions. It worked really well. It’s modern Windows that’s a buggy mess that crashes all the time.
To get back to the other point though: moving away from C was a mistake. It’s not that much more complicated than other languages. Most of the complexity was just in setting up the environment, which was admittedly terrible with C, trying to link libraries and so on. The actual code is not really much more difficult than, say, Python, but it’s a different paradigm. You are getting closer to the hardware, and your code isn’t automatically cross-platform unless you use platform-agnostic libraries.

It’s entirely possible to write multiplatform code in C, and most programs could be written in a multiplatform way if developers used libraries that target multiplatform development and compiled them ahead of time. It’s just that companies like Microsoft created proprietary junk like .NET and DirectX, which made writing multiplatform code much harder if you didn’t start with libraries like Qt or GTK, and OpenGL. Again, this was never a fault of C. You could even have a standard in CPUs for bootstrapping a compiler from any code, giving you platform-agnostic binaries; that just never happened, because there was no point while so much code was written against locked-down .NET and DirectX.
Interpreted languages were intended to solve those issues: making platform-agnostic code, and making code that was safe to run from websites without compromising the integrity of the user’s root filesystem. But these are terrible solutions, especially as interpreted languages moved beyond web stuff and small, simple apps to being used everywhere and integrated into every part of the system.
Python is a scripting language. It’s best used to call C libraries or to write very lightweight apps that don’t depend on low-level hardware access. Java is like C but worse. JavaScript is like the worst of all worlds: weakly typed, verbose, picky about syntax, slow, interpreted, insecure, bloated. But it is cross-platform, which is probably why it originally got so popular; that capability should have just been added to C instead. When you have code that runs 10 to 10,000 times slower, and bad programmers who don’t know how to write code that doesn’t thrash the bus or use 100% of your system resources for no benefit, you end up in the mess we have today. Every app that uses 100% of your memory bandwidth halves the speed of the next program; if you have 3 programs pegging the memory bus, the next program runs at roughly a quarter speed. This is not how software should be written.
Python can also be great for prototyping algorithms and for automating things that run once, not in loops. But once you’ve figured it out, it should be rewritten in C. All of these libraries written for the modern web should have targeted C.
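To sketch that workflow (a hypothetical example; GCC on Linux and the library name are my assumptions): the hot loop moves into a C shared library, and Python just loads it via ctypes.

```c
/* mathlib.c -- hypothetical "prototype in Python, ship in C" example.
 * Build as a shared library:  gcc -O2 -shared -fPIC mathlib.c -o libmathlib.so
 * Python side (sketch):       ctypes.CDLL("./libmathlib.so").dot(...)
 */
#include <stddef.h>

/* Dot product of two double arrays: exactly the kind of tight loop
 * that is slow in pure Python but trivial and fast in C. */
double dot(const double *a, const double *b, size_t n) {
    double sum = 0.0;
    for (size_t i = 0; i < n; i++) {
        sum += a[i] * b[i];
    }
    return sum;
}
```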
The cool thing about C is that you can use it like BASIC if you really want, with a bit more syntax, and you don’t have to use it with classes. You can just allocate memory on the stack and heap and then delete all of it with like one class if you really want to. Most of what’s cool about other languages already exists in C.
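Assuming "free everything in one go" is the idea here, a minimal arena (bump) allocator sketch shows it: one malloc up front, pieces handed out from it, and a single free() at the end. Toy code for illustration, ignoring alignment details.

```c
/* arena.c -- minimal bump allocator: all allocations come out of one
 * block, and one free() releases everything at once. */
#include <stdlib.h>

typedef struct {
    char  *base;
    size_t used;
    size_t cap;
} Arena;

void *arena_alloc(Arena *a, size_t n) {
    if (a->used + n > a->cap) return NULL;  /* out of space */
    void *p = a->base + a->used;
    a->used += n;
    return p;
}

int main(void) {
    Arena a = { malloc(4096), 0, 4096 };
    if (!a.base) return 1;
    int  *xs  = arena_alloc(&a, 10 * sizeof *xs);
    char *msg = arena_alloc(&a, 32);
    (void)xs; (void)msg;      /* ... use them ... */
    free(a.base);             /* one call frees everything above */
    return 0;
}
```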
It’s kind of amazing to see the difference between a Linux smartphone and an Android smartphone these days. A Linux smartphone on hardware that’s terrible by today’s standards is just instant. 32 GB of storage is enough to add everything you want to the operating system, because binaries are like 2 MB. Then that all goes away as soon as you open a web browser; a single website just kills it. Then you sit down at a modern Windows machine and everything is slow and buggy as shit. It draws 500 W of power on a 2 nm process node. It’s a real issue. No amount of computing power will ever overcome interpreted languages, because people will always do the minimum possible work to get things running at an unstable 30 FPS and call it good.
No. No, you seriously can’t, not even if you are deploying to one single PC. Your code includes libraries and frameworks. With some studying, you might be able to familiarize yourself to the point where you know every single flow through the frameworks and libraries, down to each line that’s being executed. Then it goes through the compiler. Compiler building is an art unto itself. Maybe there are a handful of people who understand everything GCC does in its roughly 200 MB of source code. But let’s say you are a super crack programmer who can memorize source code with about as many characters as 42x all of the Harry Potter books.
Now your code gets executed by the OS. If you are on Windows: Sucks to be you, because it’s all closed source. All you can manage to understand is the documentation, unless you decompile all of Windows. If you are on Linux you at least have the source code. That’s only 300MB of source code, shouldn’t be hard to completely understand everything in there and keep it in memory, right? And you aren’t running your code directly on the bare Linux kernel, so please memorize everything your DE and other relevant components do.
But we aren’t done yet, we are just through the software part. Hardware is important too, since it might or might not implement everything exactly like in the documentation. So break out your hex editor and reverse-engineer the latest microcode update, to figure out how your CPU translates your x64 calls to whatever architecture your CPU uses internally. An architecture that, btw, doesn’t have any public documentation at all. Might be time to break out the old electron microscope and figure out what the 20 billion transistors are doing on your CPU.
Now we are done, right? Wrong. The CPU is only one component in your system. Now figure out how all other components work. Did you know that both your GPU and your network interface controller are running full embedded operating systems inside them? None of that is publicly documented or open source, so back to the electron microscope and reading binary code in encrypted update files.
If you think all this knowledge fits into a single human’s brain in a way that this human actually knows what all of these components do in any given circumstance, then I don’t really know what to say here.
It’s not a matter of skill. It’s just plain impossible. It is likely easier to memorize every book ever written.
You can write webapps in C using Webassembly. Nobody does it because it takes much more time and has basically no upsides.
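For reference, a minimal sketch of how that looks, assuming the Emscripten toolchain (the file and function names here are my own illustration):

```c
/* add.c -- minimal C-to-WebAssembly sketch.
 * Build (Emscripten assumed):  emcc add.c -o add.js
 * EMSCRIPTEN_KEEPALIVE keeps the function from being dead-code
 * eliminated, so it stays exported to the JavaScript side. */
#include <emscripten/emscripten.h>

EMSCRIPTEN_KEEPALIVE
int add(int a, int b) {
    return a + b;
}
```

The exported function is then reachable from JavaScript (roughly as `Module._add(2, 3)`), but you still need JS glue for anything DOM-related, which is part of why almost nobody bothers.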
I would recommend that you revisit these old OSes if you think that. Fire it up in a VM and use it for a few weeks or so. Nostalgia is a hell of a drug. I did run Win98 for a while to emulate games, and believe me, your memory doesn’t reflect reality.
Reading what you are writing about programming, may I ask about your experience? It sounds to me like you dabbled in a bit of hobby coding a while ago, is that right?
Because your assessments don’t really make much sense otherwise.
No, the problem was not setting up the environment. The main problem with C is that it doesn’t do memory management for you, and thus you constantly have to deal with buffer overflows, memory leaks, dangling pointers and so on. If you try to write past a buffer in any modern language, either the compiler or the runtime will catch it and throw an error. You cannot write past e.g. the length of an array in Java, Python or any other higher-level language like that. C/C++ will happily let you write straight across the stack or heap, no questions asked. This leads to C programs being incredibly vulnerable to attacks and instabilities. That’s the main issue with C/C++.
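A toy example of that difference (any C compiler will accept this):

```c
/* overflow.c -- C's missing bounds checks, demonstrated.
 * The loop writes one element past the end of buf. C compiles and
 * runs this without complaint (it's undefined behavior), while the
 * equivalent index in Java or Python raises an error at runtime. */
#include <stdio.h>

int main(void) {
    int buf[4] = {0};
    for (int i = 0; i <= 4; i++) {  /* off-by-one: valid indices are 0..3 */
        buf[i] = i;
    }
    printf("%d\n", buf[3]);
    return 0;
}
```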
C is just as much “inherently multiplatform” as Python: Use pure C/Python without dependencies and your code is perfectly multi-platform. Include platform-specific dependencies and you are tied to a platform that supplies these dependencies. Simple as that. Same thing for every other language that isn’t specifically tied to a platform.
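A small illustration of that point (a hypothetical `sleep_ms` wrapper; Win32 and POSIX assumed): pure standard C needs no branches, but the first OS-specific call does.

```c
/* sleep_ms.c -- the moment you need anything the C standard library
 * doesn't cover (here: sleeping for milliseconds), portability ends
 * unless you branch per platform yourself. */
#ifdef _WIN32
  #include <windows.h>
  static void sleep_ms(unsigned ms) { Sleep(ms); }
#else
  #include <unistd.h>
  static void sleep_ms(unsigned ms) { usleep(ms * 1000u); }
#endif

int main(void) {
    sleep_ms(100);  /* same call, different OS machinery underneath */
    return 0;
}
```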
That standard exists, it’s called LLVM, and there are alternatives to it too. And there are enough platform-agnostic binaries and such, but if you want to do platform-specific things (e.g. use a GPU or networking or threads or anything hardware- or OS-dependent), you need to do platform-specific stuff.
I don’t even know what kind of bus you are talking about here.

If you mean the PCIe bus: no worries, your Python code is not hogging the PCIe bus, or any other bus for that matter. It’s hard to even reply to this paragraph, since pretty much no single statement in it is based in fact.
You cannot use C with classes. That’s C++. C doesn’t have classes.
Hmm, nope. Linux smartphones run fast because they have no apps. Do a factory reset on your Android phone and disable all pre-installed apps. No matter what phone it is, it will run perfectly fast.
But if you run tons of apps with background processes, it will cost performance.
I use Linux as my main OS, but I have Windows as a dual-boot system for rare cases. My PC draws 5 W at idle, on Windows or on Linux. The 500 W is what your PSU is rated for, or maybe what the PC can draw at full load with the GPU running at full speed (e.g. if you play a photo-realistic game), not what it uses when idling or just holding a few hundred browser tabs open.
I’m on mobile, so it’s hard to address everything you wrote; maybe I’ll just clarify a few points that bothered me. You are obviously very knowledgeable in these things, even more than me in many areas. I am a hobbyist, not a professional programmer, but I have been programming since I was 12 or so and I’m in my 30s now, and I have also always been a white hat hacker.
I don’t mean you can literally understand everything about a computer, just that you can understand everything you need to in order to do 99% of things, and this isn’t some crazy feat. You would obviously use OpenGL or Vulkan or DirectX to access the GPU instead of programming it directly.
Modern machines do use several hundred watts just doing regular things. Not at idle, sure, unless you have tons of junk running in the background, but even basic tasks on modern machines, running code written in languages like Python and Java and Electron and web stuff, will absolutely use much of your system’s hardware for simple jobs.
Managing memory in C++ is easy, but you have to not be stupid. C++ isn’t stupid-proof. It’s also not a good fit for some things, because people make mistakes, or they just take advantage of the fact that C is low level and has direct hardware access to exploit things. The issue is really that below a certain level of programming skill, C++ can be really unsafe. You need to understand concepts like building your own node graph with inheritance to make managing memory easy; it is easy once you understand these things. Garbage collectors are not a good solution for many things, I would argue most things. Easy, sure, but also buggy, and they break the idea of just having smooth-running software. You should be freeing your memory just as you allocated it, in an organized and thoughtful way.
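That "free it the way you allocated it" discipline, sketched in plain C since that is what this thread keeps coming back to (the `buffer_create`/`buffer_destroy` names are hypothetical): every create has exactly one matching destroy, called from exactly one owner.

```c
/* buffer.c -- one way to keep manual memory management organized:
 * paired create/destroy functions with a single, well-defined owner. */
#include <stdlib.h>

typedef struct {
    char  *data;
    size_t len;
} Buffer;

Buffer *buffer_create(size_t len) {
    Buffer *b = malloc(sizeof *b);
    if (!b) return NULL;
    b->data = calloc(len, 1);
    if (!b->data) { free(b); return NULL; }
    b->len = len;
    return b;
}

void buffer_destroy(Buffer *b) {
    if (!b) return;
    free(b->data);  /* release in the reverse order of acquisition */
    free(b);
}

int main(void) {
    Buffer *b = buffer_create(1024);
    /* ... use b ... */
    buffer_destroy(b);  /* the one release point for everything above */
    return 0;
}
```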
By memory bus I mean the front side bus, which suffers if you have programs running at uncapped speeds, or just programs running with 100x the overhead they would have if written in C. Again, this is basic knowledge that any programmer should know without really even being taught. There is no reason for programs to bottleneck your machine in an era of multitasking.
Writing code in C++ also doesn’t take longer after like 5 minutes of getting used to it; it’s actually much quicker, because you can just write it without it complaining about indentation or anything. It is a bit verbose with brackets and such, but those are there to facilitate a powerful language that can do pretty much anything. There are also string libraries and such that handle strings without the security issues.
Linux is also tiny without being devoid of software, because it’s written in C and things are only as large as they need to be. My entire Linux OS for my phone, with all the files I’m working on, is less than 10 GB, and it has emulators, many libraries, many applications, several web browsers, Wine, virtual machines, servers, several different development environments, different window managers, and all kinds of other stuff. On Android, installing a web browser can take hundreds of MB for essentially zero benefit.
No benefits to WebAssembly? I guess to you that may be true, because you don’t care about optimization, download size, energy use and things like that. It does have benefits, because, one, not everyone has thousands to upgrade their computer every two years, and where I live, in a Republican state in America, the internet maxes out at 200 KB/s, and on a good day maybe 500 KB/s.
The first step on fixing a problem is admitting you have a problem. Software is only going to get worse if devs are in denial about it.
There’s one big difference between hobby work and professional work: If you do hobby stuff, you can spend as much time on it as you want and you are doing what you want. So likely, you will do the best you can do with your skill level, and you are done when you are done, or when you give up and stop caring.
If you do professional work, there’s a budget and a deadline. There are a dozen things you need to do RIGHT NOW that needed to be finished yesterday. There’s not one person working on things but dozens, and your code doesn’t live for weeks or months but for years or decades, and will be worked on by someone when you are long gone. It’s not rare to stumble upon 10- or 20-year-old code in most bigger applications. There are parts of the Linux kernel that are 30 years old.
Also, in professional work you have non-technical managers dictating what to do, and often even the technical implementation. You have rapidly shifting design goals; stuff needs to be implemented in crunch time and then gets cancelled a day before release. Systems are way more complex than they need to be.
I am currently working on the backend of the website and app for a large retail business. The project is simple, really: get content from the content managers, display a website with a webshop, handle user logins and loyalty program data. Not a ton of stuff.
Until you realize how many different teams, legacy systems and interdependent services are actually involved.
We are trying to overhaul this right now, and we just had a meeting last week, where we got someone from all of these teams around a table to figure out how different calls to the customer database actually work. It took us 6 hours and 15 people just to reverse-engineer the flow of two separate REST calls.
If you see bugs and issues in software, that’s hardly ever due to bad programmers, but due to bad organization and bad management.
This is exactly what the software crisis is, btw. With infinite time and infinite brain capacity, one could program optimally. But we don’t have infinite time, we don’t have infinite budget, and while processors get faster each year, developers just disappointingly stay human.
So we abstract stuff away. Java is slower than C (though not by a ton), mostly because it has managed memory. Managed memory means no memory management issues. That’s a whole load of potential bugs, vulnerabilities and issues just removed by changing the language. Multitasking is also much, much easier in Java than in C.
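To make that concrete, here's a toy C example of a bug class that managed memory simply removes; the Java equivalent of this cannot be written.

```c
/* double_free.c -- the kind of ownership bug a garbage collector
 * eliminates. Two code paths both believe they own the pointer; the
 * second free() is undefined behavior in C and often crashes. In a
 * managed language, the runtime tracks the object's lifetime, so
 * this whole category of bug does not exist. */
#include <stdlib.h>
#include <string.h>

int main(void) {
    char *name = malloc(32);
    if (!name) return 1;
    strcpy(name, "example");
    free(name);
    /* ... later, in another code path that also thinks it owns name: */
    free(name);  /* double free: undefined behavior */
    return 0;
}
```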
Now you choose a framework like Spring Boot. Yes, it’s big, yes you won’t use most of it, but it also means you don’t need to reimplement REST and request handling. Another chunk of work and potential bugs just removed by installing a dependency. And so on.
Put it differently: how much does a, let’s say, 20% slowdown due to managed memory cost a corporation?
How much does a critical security vulnerability due to a buffer overflow cost a corporation?
Hobby development and professional development aren’t that similar when it comes to all that stuff.
Maybe it’s sort of a tragedy of the commons. Maybe new standards should support limiting the resource use of software, with defaults low enough that companies would be forced to allow time to write good code, or their products would be unusable on 90% of machines. That might actually fix the issue: companies could still push programmers to churn out code as fast as possible, but they would have to allow time to optimize and clean up. I don’t know. I just deal with it by avoiding all that stuff, because on principle I actually have enough willpower to stop using something even when it’s more convenient, which I realize is rare. Most people just want their TikTok OS, and they don’t care if they have to pay $1000 for a device that’s a glorified streaming media player.

I’m glad Linux exists and that it’s still written in C. I’m going to release a game some day, and I’m going to target Linux as the native client, and I don’t care if I lose 80% of my customers. I want to be part of the solution and not the problem, but I understand that survival and keeping a job is important to someone like you. Anyways, good talk. Windows XP and 7 were much better than any modern operating system ever will be. Linux is catching up fast, and with the state of the industry we will probably all be on Linux running C code before long; I can’t even use Windows anymore. Much of the web is becoming that way as well: purely profit-driven, run by publicly traded companies that hate humans, always brownnosing the state and their corporate sponsors so the gestapo doesn’t come for their profits next.