also I just realized that Brazil did NOT make a programming language entirely in Spanish and call it “Si” and that my professor was making a joke about C… god damn it

this post is probably too niche but I feel like Lemmy is nerdy enough that enough people will get it lol

    • cooligula@sh.itjust.works · +9 · 9 hours ago

      Why would you hate on it? It has its use cases. You won’t build an OS in Python, but I’d much rather do data processing in Python than in C

        • cooligula@sh.itjust.works · +2 · 3 hours ago

          You just cannot do it, I’m afraid. Python is an interpreted language: the CPython interpreter has to execute your code at run time, and that requires an underlying OS to make the system calls. The closest thing would be MicroPython, which can run on bare metal on microcontrollers, but that’s about it. The only other option I can think of is a custom compiler that generates C/C++ or assembly code from a Python script (roughly what Cython does), which you then build with a standard C/C++/assembly compiler.

      • PeriodicallyPedantic@lemmy.ca · +1 · 4 hours ago

        I hate on it mainly for its lack of static typing.
        I tried building a HomeAssistant add-on in Python, and it was not a good experience. Idk what IDE Python devs usually use, but VSCode did not provide much assistance.

        • cooligula@sh.itjust.works · +2 · 4 hours ago

          You can in fact use static typing in Python. For example, annotating variables:

          six: int = 6
          hello_world: str = "Hello World!"
          

          Or defining functions:

          def foo(x: int) -> int:
              return x**2
          

          If you only want to use static Python, you can use the mypy static checker:

          # Install mypy if you don’t have it
          pip install mypy
          
          # Run the checker on the file (e.g., example.py)
          mypy example.py
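One caveat, with a quick runnable illustration: the annotations are only hints. CPython never enforces them at run time, which is exactly why you want a checker like mypy in the loop.

```python
# Annotations are metadata only: CPython stores them but never checks them.
count: int = 5
count = "five"  # runs fine at run time; mypy would report an error here

def double(x: int) -> int:
    return x * 2

# Nothing stops a str argument at run time; "five" * 2 just repeats the string.
print(double("five"))  # prints "fivefive"
```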
          
          • PeriodicallyPedantic@lemmy.ca · +1 · 3 hours ago

            I was using that syntax, but nothing seemed to be checking it. Running an external app to get static checking done isn’t great; presumably there are extensions for common IDEs?

            But the poor VSCode developer experience went beyond that. I attribute it to dynamic typing, because most of my frustration was with the IDE’s inability to tell me the type of a given variable and what functions/properties were accessible on it.

            I hope it’d be better on an IDE made specifically for python, although idk how many extensions I’d have to give up for it, and things like devcontainers.

        • buttnugget@lemmy.world · +2 · 4 hours ago

          I am currently taking a Python class and we are using PyCharm. I’m not a developer, so I don’t know if it’s good yet.

  • hardcoreufo@lemmy.world · +7 · 13 hours ago

    I try to avoid Python for two main reasons. While coding: significant whitespace. Who thought that was a good idea? While using: shared dependencies. Again, who thought that was a good idea? I have to use pipx or manually make a venv, otherwise Python scripts start breaking each other. May as well just package it with its own dependencies from the get-go.

    • PlexSheep@infosec.pub · +1 · 2 hours ago

      That’s not my problem with it. My problem is that it’s too dynamic, especially that I can’t have proper types.

    • Kacarott@aussie.zone · +21 · 10 hours ago

      I genuinely do not understand the problem people seem to have with whitespace. Literally any well-formatted code uses whitespace for indentation.

      I imagine that if Python syntax were the norm and a C-style syntax language appeared later, the same group of people would be complaining: “curly brackets? Who thought that was a good idea?”

      • Pup Biru@aussie.zone · +5 · 7 hours ago

        right? like if white space weren’t required, how would you format your code differently? arbitrary white space all over the place? no indentation? that is some spicy garbage code

  • edinbruh@feddit.it · +18/−1 · 1 day ago

    I’ll be honest, I think modern Python is cool. You just need to accept that it has some limitations by design, but they mostly make sense for its purpose.

    It’s true that the type system is optional, but it gets more and more expressive with every version, it’s honestly quite cool. I wish Pylance were a bit smarter though, it sometimes fails to infer sum types in if-else statements.

    After a couple large-ish personal projects I have concluded that the problem of python isn’t the language, but the users.

    On the other hand, C’s design is barren. Sure, it works, it does the thing, it gives you very low-level control. But there is nothing of note in the design, aside from some quirks of the specification. Being devoid of innovation is both its strength and its weakness.

    • realitista@lemmus.org · +12/−1 · 1 day ago

      As someone who studied C exclusively in school and used it for the majority of programming projects I had in the real world, coming to Python now is like moving from a kit car like a Caterham to a Mercedes S class.

      • JoeBigelow@lemmy.ca · +6 · 19 hours ago

        Knowing nothing about code but a fair bit about cars, does that analogy mean like, you can play around with a kit car all you want because you built it, it’s relatively simple, and if you break it you’ll know why and how to fix it, and the Mercedes being the exact opposite?

        • Pup Biru@aussie.zone · +2 · 7 hours ago

          i’d say that’s mostly reasonable… not to say you can’t mess around in the guts of python, but you can mess around a lot more in c

          the flip side of this is that python has a lot more guard rails: it’s simply impossible to write entire classes of sometimes very dangerous and subtle bugs in python code, while in c… go for it! those are valid operations that you may have decided to do for performance reasons (also a reasonable argument, but if you know you’re not doing that fuckery, then maybe it’s better to have the software not let you do it, by accident or on purpose)

        • Jarix@lemmy.world · +1 · 9 hours ago

          I think it’s about using it, not the mechanics of it. No putting together required, and you have everything you need to start driving it the moment you get the keys.

          I’m not a mechanic or a programmer, so I’m also curious what they meant lol

    • DarkAri@lemmy.blahaj.zone · +9/−5 · edited · 1 day ago

      C does one thing really well, and that’s everything, fast, with complete control. Python is cool for people just trying to bang out some scripts or learning to program, but interpreted languages have no place in mainstream software. Devices are starting to become slower than computers were 30 years ago because there is so much garbage being included in apps written in interpreted Java and Python and other nonsense. It’s not just bad for the user, it’s bad for the planet. It shouldn’t take a million times the energy to run a simple program because someone doesn’t know how to write in a proper language.

      Python is okay for some things; the world has just become too reliant on it. Also, for purely selfish reasons if you’re the type: interpreted languages kill your battery life and RAM. Modern Android phones, besides all the problems with Google ruining them like Microsoft, are just becoming incredibly slow and stupid. You can barely open two apps without most Android phones panicking and closing apps to save memory. A calculator app is 100 MB now. The phone feels like it’s going to catch fire when you open a notepad.

      • Jarix@lemmy.world · +2 · 9 hours ago

        For us Luddite lurkers who couldn’t figure it out from context alone: which one is the interpreted language? I got lost on that detail lol

        • DarkAri@lemmy.blahaj.zone · +2 · 3 hours ago

          Interpreted languages are languages that are translated at run time, as the program runs. Compiled languages are compiled into binaries ahead of time by a compiler and then distributed as binaries.

          Basically, with interpreted languages there is a big overhead, because the code has to be turned into something the machine can execute while the program is running. This is why games and operating systems are written in C, but people learn Python and Java in their college classes. Interpreted languages are often much easier to learn than C, and cross-platform, but C is fast and powerful.
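You can actually watch that middle step. A small illustration using Python’s standard dis module, which prints the bytecode instructions the CPython interpreter executes one by one:

```python
import dis

def add(a, b):
    return a + b

# CPython compiles the function to bytecode once, then its virtual machine
# interprets these instructions every time the function is called.
dis.dis(add)
```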

      • edinbruh@feddit.it · +8 · 19 hours ago

        I like many of your points, but your comment is facetious.

        You said it yourself: “it’s good for someone trying to bang out scripts”… and that’s it, that’s the main point, that’s the purpose of Python. I will argue until my dying breath that Python is a trillion times better than sh/bash/zsh/fish/bat/PowerShell/whatever for writing scripts, in every aspect except availability, and if that’s a concern, the only real options are the old Unix shell and bat (even with PowerShell you never know if you’re stuck on PS 5 or can use PS 7).
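As a small made-up example of the kind of script work where Python beats shell, counting the files in the current directory by extension, with no awk/sort/uniq pipeline needed:

```python
from collections import Counter
from pathlib import Path

# Tally files in the current directory by extension.
counts = Counter(p.suffix or "(no ext)" for p in Path(".").iterdir() if p.is_file())

# Print extensions from most to least common.
for ext, n in counts.most_common():
    print(ext, n)
```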

        I have a Python script running 24/7 on a Raspberry Pi that listens on some MQTT topics and reacts accordingly, asynchronously. It uses about 15 KiB (literally less than 4 pages) of RAM, mostly for the interpreter, and it’s plenty responsive. It uses about two minutes of CPU time a day. I could have written it in Rust or Go, I know enough of both to do it; it would have been faster and more efficient, but it would have taken three times as long to write, and it would have been a bitch to modify. I could have done it in C and it would have been even worse. For that little extra efficiency it makes no sense.

        You argue it has no place in mainstream software, but that’s not really a matter of Python, more a matter of bad software engineers. OK, cool that you recognise the issue, but I’d rather you went after the million people shipping a full browser in every GUI application than after the guys wasting 10 kiB of your RAM to run Python. And even in that case, it’s not an issue of JavaScript, but an issue of bad practices.

        P.S. “Does one thing well” is a smokescreen to hide doing less stuff; you shouldn’t base your whole design philosophy on a quote from the 70s. That’s the kind of shit systemd haters shout while running a display server that also manages input, OpenGL, a widget toolkit, remote desktop, and the entire printer stack. The more a high-profile tool does, the less your janky glue-code scripts need to do.

        • DarkAri@lemmy.blahaj.zone · +2 · 12 hours ago

          Python is okay for some things. It’s just that software in general has become terrible, because there is so much wasted power now that people have access to fast hardware. In the 90s your entire environment would use a few MB of RAM. I know high-res images increase some of this, but people are so wasteful with how they write stuff these days. We are evolving backwards: we spend hundreds or thousands on amazing hardware only to have it run like trash in a world where everything is written in Java and Python and Electron. Developers no longer optimize. They just get their webpage to run at an inconsistent 30 FPS on your $2000 computer, and collect their 150k salary, on a machine that has more computing power than every computer in the world put together in the 90s.

          It’s not just bad for your time and sanity. It’s bad for the environment, it’s bad for the economy, and this same rot is working its way into operating systems and game engines. Every game written for UE5 seems to run at 50 FPS regardless of how good your PC hardware is, because of these same low-quality programmers and terrible tools. Idk, Linux to me has been a breath of fresh air in recent times, as bad as it can be. It’s mostly C code with tiny binaries that are usually like 1–3 MB. I guess there’s a silver lining in that all of these evil corporations like Google and Meta and Apple are dying because of this. Maybe the internet will go back to being centered around user content in a distributed fashion, and not just a couple of highly controlled websites that try to brainwash you into supporting your corporate-backed government. It already seems like every triple-A game studio sucks and all the best games of the past 15 years have come from small indie studios.

          • squaresinger@lemmy.world · +5 · 9 hours ago

            Have you heard of the term “Software crisis”?

            We don’t really talk all that much about it any more, because it’s become so normal, but the software crisis was the point where computers became faster than human programmers. That problem came up in the 1960s.

            Up until then a computer was simple enough that a single human being could actually understand everything that happened under the hood and could write near-optimal code by hand.

            Since then computers doubled in performance and memory every few years, while developers have largely stayed human with human performance. It’s impossible for a single human being to understand everything that happens inside a computer.

            That’s why ever since we have tried optimizing for developer time over execution time.

            We have been using higher-level languages, frameworks, middlewares and so on to cut the time it takes to develop stuff.

            I mean, sure, we could develop like in the 90s, the tools are all still there, but neither management nor customers would accept that, for multiple reasons:

            • Everyone wants flashy, pretty, animated things. That takes an ungodly amount of performance and work. Nobody is OK with just simple, flat, unanimated stuff, let alone text-based tools.
            • Stuff needs to run on all sorts of devices: ARM smartphones, ARM/x86 tablets, ARM/x86 PCs, all supporting various versions of Windows, Mac, Android, iOS and preferably also Linux. But we also need a web version, at best running on Chrome, Firefox and Safari. You could develop all of these apps natively, but then you’d need roughly 20 apps, each developed natively by dedicated experts. Or you develop the app on top of browsers/Electron and have a single app.
            • Stuff needs to work. Win95-level garbage software is not OK any more. If you remember Win95/98 fondly, I urge you to boot it up in a virtual machine some time. That shit is unacceptably buggy. Every time the OS crashes (which happens all the time) you are playing Russian roulette with your partition.
            • Did I mention that everything needs to be free? Nobody wants to pay for software any more. Win95 was $199 ($432 in 2025 money) and Office was $499 ($1061 in 2025 money). Would you pay $1.5k just for Win11 and current Office?

            So there’s more and more to do with less and less time and money.

            We can square that circle either by reducing software quality into nothingness, or by using higher-level developer tools that allow for faster and less error-prone development while utilizing the performance that still grows exponentially.

            What would you choose?


            But ultimately, it’s still the customer’s choice. You don’t have to use VSCode (which runs on Electron). You can still use KATE. You don’t have to use Windows or Gnome or MacOS. You can use Linux and run something like IceWM on it. You don’t have to use the newest MS Office, you can use Office 2013 or Libre Office.

            For pretty much any Electron app out there, there’s a native alternative.

            But it’s likely you don’t use them. Why is that? Do you actually prefer the flashy, pretty, newer alternative, that looks and feels better?

            And maybe question why it feels so hard to pay €5 for a mobile app, and why you choose the free option over the paid one.

            • DarkAri@lemmy.blahaj.zone · +1 · edited · 2 hours ago

              I actually pay for software, but I run Linux; I’m not paying for Windows, because it’s bad. I prefer to pay rather than have ads. I value my time.

              What you are saying is somewhat true. There are hundreds of thousands of programmers these days, if not millions. The quality of the people who write software just isn’t what it used to be. Not that they don’t work hard, but they aren’t capable of writing C.

              You also can understand everything in a system, at least some people can. I understand those people are rare and expensive to hire.

              One thing C really lacks is modern libraries to do these things. It’s not a limitation of C itself it’s just that most modern tools are targeted towards other languages. I understand that writing webapps in C isn’t the best idea because you don’t want web stuff running on hardware directly most of the time if you care about security anyways, but it’s really just a trend where the industry moved away from C with all of its frameworks and stuff which has not been good for the users.

              Windows 98 was really good if you knew how it worked. I never had any issues really with stuff like XP. It always worked, it was always fast, it was always stable. I used XP for probably 10 years and never had any issues with instability and stuff and I was constantly modifying stuff, overclocking, patching drivers, modding bios, doing weird stuff that others didn’t do coming up with my own solutions. It worked really well. It’s modern windows that’s a buggy mess that crashes all the time.

              To get back to the other point though, to move away from C was a mistake. It’s not that much more complicated than using other languages. Most of the complexity was just in setting up the environment which was admittedly terrible under C. Trying to link libraries and stuff. The actual code itself is not really that much more difficult than say python, but it’s a different paradigm. You are getting closer to the hardware, and it’s not automatic that your code is going to be cross platform unless you use platform agnostic libraries. It’s entirely possible to write multiplatform code in C and most programs could be written in a multiplatform way if users use libraries that target multiplatform development and let users compile them ahead of time. It’s just that companies like Microsoft created proprietary junk like .net and direct X which made writing multiplatform code much harder if you didn’t start with libraries like qt or gtk, and openGL. Again, this was never a fault of C. You could even have a standard in CPUs that would run any code to bootstrap a compiler and you could have platform agnostic binaries, which is just something that never happened because there was not really a point to it since so much code was written in lockdown .net and directx.

              Interpreted languages were intended to solve those issues: making platform-agnostic code, and making code that is safe to run from websites without compromising the integrity of the user’s root filesystem. But these are terrible solutions, especially as interpreted languages moved beyond web stuff and small simple apps to being used everywhere and integrated into every part of the system.

              Python is a scripting language. It’s best used to call C libraries or to write very lightweight apps that don’t depend on low level hardware access. Java is like C but worse. JavaScript is like the worst of all worlds, strongly typed, verbose, picky about syntax, slow, interpreted, insecure, bloated, but it is cross platform which was originally probably why it was so popular. That should have just been added to C however. When you have code that runs 10x-10,000 times slower and you have bad programmers who don’t know how to write code that doesn’t destroy the bus, or use 100% of your system resources for no benefit, you end up in this mess we have today, for every app that uses 100% of your memory bandwidth, that halves the speed of the next program. If you have 3 programs running that peg then Emory bus, that means your next program is going to run at 0.25 the speed roughly. This is not how software should be written.

              Python can also be great for prototyping algorithms and automating things that run once, not in loops. However, once you figure it out, it should be rewritten in C. All of these libraries written for the modern web should have been written to target C.

              The cool thing about C is you can use it like basic if you really want. With a bit more syntax, but you don’t have to use it with classes. You can just allocate memory on stack and heap and then delete all of it with like one class if you really want to. Everything that’s cool about other languages mostly just already exists in C.

              It’s kind of amazing to see the difference between a Linux smartphone and an android smartphone these days. A Linux smartphone running terrible hardware by today’s standard is just instant. 32 GBs of storage is enough to add everything you want to the operating systems because binaries are like 2 MB. Then that all goes away as soon as you open a web browser. A single website just kills it. Then you sit down on a modern windows machine and everything is slow and buggy as shit. It draws 500w of power on a 2nm process node. It’s a real issue. No amount of computer power will ever overcome interpreted languages because people will always do the minimum possible work to get it to run at an unstable 30 FPS and call it good.

              • squaresinger@lemmy.world · +1 · 37 minutes ago

                You also can understand everything in a system, at least some people can. I understand those people are rare and expensive to hire.

                No. No, you seriously can’t, not even if you are deploying to one single PC. Your code includes libraries and frameworks. With some studying, you might be able to familiarize yourself to the point where you know every single flow through the frameworks and libraries, down to each line that’s being executed. Then it goes through the compiler. Compiler building is an art unto itself. Maybe there are a handful of people who understand everything GCC does in the roughly 200MB of its source code. But let’s say you are a superstar programmer who can memorize source code with about as many characters as 42 copies of all the Harry Potter books.

                Now your code gets executed by the OS. If you are on Windows: Sucks to be you, because it’s all closed source. All you can manage to understand is the documentation, unless you decompile all of Windows. If you are on Linux you at least have the source code. That’s only 300MB of source code, shouldn’t be hard to completely understand everything in there and keep it in memory, right? And you aren’t running your code directly on the bare Linux kernel, so please memorize everything your DE and other relevant components do.

                But we aren’t done yet, we are just through the software part. Hardware is important too, since it might or might not implement everything exactly like in the documentation. So break out your hex editor and reverse-engineer the latest microcode update, to figure out how your CPU translates your x64 calls to whatever architecture your CPU uses internally. An architecture that, btw, doesn’t have any public documentation at all. Might be time to break out the old electron microscope and figure out what the 20 billion transistors are doing on your CPU.

                Now we are done, right? Wrong. The CPU is only one component in your system. Now figure out how all other components work. Did you know that both your GPU and your network interface controller are running full embedded operating systems inside them? None of that is publicly documented or open source, so back to the electron microscope and reading binary code in encrypted update files.

                If you think all this knowledge fits into a single human’s brain in a way that this human actually knows what all of these components do in any given circumstance, then I don’t really know what to say here.

                It’s not a matter of skill. It’s just plain impossible. It is likely easier to memorize every book ever written.

                One thing C really lacks is modern libraries to do these things. It’s not a limitation of C itself it’s just that most modern tools are targeted towards other languages. I understand that writing webapps in C isn’t the best idea because you don’t want web stuff running on hardware directly most of the time if you care about security anyways, but it’s really just a trend where the industry moved away from C with all of its frameworks and stuff which has not been good for the users.

                You can write webapps in C using Webassembly. Nobody does it because it takes much more time and has basically no upsides.

                Windows 98 was really good if you knew how it worked. I never had any issues really with stuff like XP. It always worked, it was always fast, it was always stable. I used XP for probably 10 years and never had any issues with instability and stuff and I was constantly modifying stuff, overclocking, patching drivers, modding bios, doing weird stuff that others didn’t do coming up with my own solutions. It worked really well. It’s modern windows that’s a buggy mess that crashes all the time.

                I would recommend that you revisit these old OSes if you think that. Fire it up in a VM and use it for a few weeks or so. Nostalgia is a hell of a drug. I did run Win98 for a while to emulate games, and believe me, your memory doesn’t reflect reality.


                Reading what you are writing about programming, may I ask about your experience? It sounds to me like you dabbled in a bit of hobby coding a while ago, is that right?

                Because your assessments don’t really make much sense otherwise.

                To get back to the other point though, to move away from C was a mistake. It’s not that much more complicated than using other languages. Most of the complexity was just in setting up the environment which was admittedly terrible under C. Trying to link libraries and stuff. The actual code itself is not really that much more difficult than say python, but it’s a different paradigm.

                No, the problem was not setting up the environment. The main problem with C is that it doesn’t do memory management for you, and thus you constantly have to deal with stuff like buffer overflows, memory management issues, memory leaks, pointer overflows and so on. If you try to write past a buffer in any modern language, either the compiler or the runtime will catch it and throw an error. You cannot write past e.g. the length of an array in Java, Python or any other higher-level language like that. C/C++ will happily let you write straight across the stack or heap, no questions asked. This leads to C programs being incredibly vulnerable to attacks and instabilities. That’s the main issue with C/C++.
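A tiny illustration of the Python side of that (the buffer here is hypothetical, just for demonstration): the out-of-bounds write is caught immediately, where the equivalent C code would silently scribble over adjacent memory.

```python
buf = [0] * 4  # a fixed-size "buffer" of four slots

try:
    buf[10] = 1  # out-of-bounds write
except IndexError as e:
    # The runtime catches it before any memory is touched.
    print("caught:", e)
```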

                and it’s not automatic that your code is going to be cross platform unless you use platform agnostic libraries. It’s entirely possible to write multiplatform code in C and most programs could be written in a multiplatform way if users use libraries that target multiplatform development and let users compile them ahead of time.

                C is just as much “inherently multiplatform” as Python: Use pure C/Python without dependencies and your code is perfectly multi-platform. Include platform-specific dependencies and you are tied to a platform that supplies these dependencies. Simple as that. Same thing for every other language that isn’t specifically tied to a platform.

                You could even have a standard in CPUs that would run any code to bootstrap a compiler and you could have platform agnostic binaries, which is just something that never happened because there was not really a point to it since so much code was written in lockdown .net and directx.

                That standard exists, it’s called LLVM, and there are alternatives to that too. And there are enough platform agnostic binaries and stuff, but if you want to do platform-specific things (e.g. use a GPU or networking or threads or anything hardware- or OS-dependant) you need to do platform-specific stuff.

                Python is a scripting language. It’s best used to call C libraries or to write very lightweight apps that don’t depend on low level hardware access. Java is like C but worse. JavaScript is like the worst of all worlds, strongly typed, verbose, picky about syntax, slow, interpreted, insecure, bloated, but it is cross platform which was originally probably why it was so popular. That should have just been added to C however. When you have code that runs 10x-10,000 times slower and you have bad programmers who don’t know how to write code that doesn’t destroy the bus, or use 100% of your system resources for no benefit, you end up in this mess we have today, for every app that uses 100% of your memory bandwidth, that halves the speed of the next program. If you have 3 programs running that peg then Emory bus, that means your next program is going to run at 0.25 the speed roughly. This is not how software should be written.

                I don’t even know what kind of bus you are talking about. Emory bus is a bus line in Atlanta.

                If you are talking about the PCIe bus, no worries, your python code is not hogging the PCIe bus or any other bus for that matter. It’s hard to even reply to this paragraph, since pretty much no single statement in there is based in fact.

                The cool thing about C is you can use it like basic if you really want. With a bit more syntax, but you don’t have to use it with classes. You can just allocate memory on stack and heap and then delete all of it with like one class if you really want to. Everything that’s cool about other languages mostly just already exists in C.

                You cannot use C with classes. That’s C++. C doesn’t have classes.

                It’s kind of amazing to see the difference between a Linux smartphone and an Android smartphone these days. A Linux smartphone running terrible hardware by today’s standards is just instant. 32 GB of storage is enough to add everything you want to the operating system, because binaries are like 2 MB. Then that all goes away as soon as you open a web browser. A single website just kills it.

                Hmm, nope. Linux smartphones run fast because they have no apps. Do a factory reset on your Android phone and disable all pre-installed apps. No matter what phone it is, it will run perfectly fast.

                But if you run tons of apps with background processes, it will cost performance.

                Then you sit down on a modern windows machine and everything is slow and buggy as shit. It draws 500w of power on a 2nm process node. It’s a real issue. No amount of computer power will ever overcome interpreted languages because people will always do the minimum possible work to get it to run at an unstable 30 FPS and call it good.

                I use Linux as my main OS, but I have Windows as a dual-boot system for rare cases. My PC draws 5w at idle on Windows or on Linux. The 500w is what your PSU is rated for, or maybe what the PC can draw at full load with the GPU running at full speed (e.g. if you play a photo-realistic game), not what is used when the PC idles or just has a few hundred tabs open in the browser.

                • DarkAri@lemmy.blahaj.zone
                  link
                  fedilink
                  arrow-up
                  1
                  ·
                  edit-2
                  9 minutes ago

                  I’m on mobile so it’s hard to keep track of all the things you wrote. Maybe I’ll clarify a few points that bothered me. You are obviously very knowledgeable in these things, even more than me in many areas. I am a hobbyist, not a professional programmer, but I have been programming since I was 12 or so and I’m in my 30s now, and I have also always been a white hat hacker.

                  I don’t mean you can literally understand everything about a computer, just that you can understand everything you need to in order to do 99% of things, and this isn’t some crazy thing. You would obviously use OpenGL or Vulkan or DirectX to access the GPU instead of writing binaries.

                  Modern machines do use several hundred watts just doing regular things. Not at idle, sure, unless you have tons of junk running in the background, but even basic tasks on modern machines which use code written in languages like Python and Java and Electron and web stuff will absolutely use much of your system’s hardware for simple tasks.

                  Managing memory in C++ is easy, but you have to not be stupid. C++ isn’t stupid-proof. It’s also not a good fit for some things, because people make mistakes or just take advantage of the fact that C is low level and has direct access to exploit things. The issue is really that if you aren’t on a certain level of programming then C++ can be really unsafe. You need to understand concepts like creating your own node graph with inheritance to make managing memory easy. It is easy once you understand these things. Garbage collectors are not a good solution for many things. I would argue most things. It’s easy, sure, but also buggy, and it breaks the idea of just having smooth-running software. You should be freeing your memory just as you allocated it, in an organized and thoughtful way.

                  By memory bus I mean the front side bus, which if you have programs running at uncapped speeds is bad. Again this is just basic knowledge that any programmer should know without even being taught really. There is no reason to have programs bottleneck your machine when we live in an era of multitasking.

                  Brb in a min

      • realitista@lemmus.org
        link
        fedilink
        English
        arrow-up
        8
        arrow-down
        1
        ·
        edit-2
        1 day ago

        Uses less memory until it inevitably springs a memory leak. And it’s not a million times the memory, it’s ~10x. You should check out assembly language, it beats C in all the metrics you care about.

        • DarkAri@lemmy.blahaj.zone
          link
          fedilink
          arrow-up
          1
          arrow-down
          1
          ·
          edit-2
          11 hours ago

          Assembly isn’t really faster than C in many cases, although it is in some; C compiles to assembly anyway. The speedups you get from assembly come from the fact that you can hand-optimize some things that compilers miss, and that can be useful sometimes, like when writing a very high-performance part of software such as a game rendering loop.
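
          For what it’s worth, the “C compiles to assembly” point is easy to see directly. A minimal sketch (the exact assembly varies by compiler, target, and flags):

          ```c
          #include <assert.h>
          #include <stdio.h>

          /* Compiling with `gcc -O2 -S square.c` dumps the assembly the
             compiler generates for this function; on x86-64 it typically
             boils down to an imul and a ret. Exact output depends on the
             compiler and flags, so treat this as illustrative only. */
          int square(int x) { return x * x; }

          int main(void) {
              assert(square(7) == 49);
              printf("%d\n", square(7));
              return 0;
          }
          ```

          Hand-optimizing then just means tweaking or replacing that generated assembly where the compiler misses something.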

          Python uses 10x the memory but probably 100x-1000x the CPU cycles to do the same thing. Also, using libraries written for interpreted languages is going to bloat your memory footprint, whereas C libraries are tiny and efficient.

          Memory leaks are an issue in all programming languages, although some use what’s called a garbage collector to handle memory, which can be okay for some things but terrible for others, like real-time software, operating systems, video games, or just anything you don’t want to hitch and lag and run like a turd. Garbage collectors aren’t some magic fix for memory management; if they were, they would be part of C. There are huge tradeoffs to not managing your own memory.

          If you are using C with objects, then you are pretty safe. The object-oriented nature of the language makes it very easy to manage memory. That’s mostly what it’s there for, besides reducing the amount of redundant code, if you are using inheritance like you are supposed to. This is called a node graph. You store your data under objects, so when you want to remove your data you just call a recursive free function on the highest parent object.
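
          A minimal C sketch of that “recursive free from the highest parent object” idea — the `Node` type and function names here are illustrative, not from any real library:

          ```c
          #include <stdio.h>
          #include <stdlib.h>

          /* Hypothetical node type: each node owns its children, so freeing
             the root releases the whole graph. Error checking on malloc is
             omitted to keep the sketch short. */
          typedef struct Node {
              int value;
              struct Node **children;
              size_t child_count;
          } Node;

          Node *node_new(int value, size_t child_count) {
              Node *n = malloc(sizeof *n);
              n->value = value;
              n->child_count = child_count;
              n->children = child_count ? calloc(child_count, sizeof *n->children) : NULL;
              return n;
          }

          /* Recursively free a node and everything it owns. */
          void node_free(Node *n) {
              if (!n) return;
              for (size_t i = 0; i < n->child_count; i++)
                  node_free(n->children[i]);
              free(n->children);
              free(n);
          }

          int main(void) {
              Node *root = node_new(1, 2);
              root->children[0] = node_new(2, 0);
              root->children[1] = node_new(3, 1);
              root->children[1]->children[0] = node_new(4, 0);
              node_free(root); /* one call releases the whole tree */
              puts("freed");
              return 0;
          }
          ```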

          The difference really is that C code is efficient, in the sense that it doesn’t waste anything. Everything that seems low level about C is there for a reason. It came from a time when it was important to write code efficiently, when every MB and cycle counted, and when having a garbage collector pausing, and potentially crashing, your operating system was unacceptable as well as extremely slow. It’s still slow, btw, because programs have scaled with the ability of hardware to run them, so garbage collectors are still mostly as terrible as they always have been.

          C is only low level in the sense that it actually runs on the hardware. There aren’t layers of stuff between it and the hardware. There is no good reason for there to be, outside of maybe security in some contexts. You don’t want web resources running on your hardware directly.

          All the other stuff that comes with modern languages is mostly nonsense. Type checking is for lazy programmers. It multiplies the time needed to do an operation. There is no good reason for it to exist other than programmers being bad at their job. C is loosely typed, btw. It checks types in the compiler, where it belongs. If your Android phone was written in C++, your battery would last for days, you could play games on it for hours, and everything would be extremely fast, with nearly instant loading of stuff.

          The reason web pages were written in JIT languages was mainly just compatibility across many different types of hardware and browsers. They were also relatively small programs. Scripting can be useful sometimes, and garbage collectors can be useful for script kiddie stuff. It has no place in mainstream software and definitely not in operating systems. Google went from “Don’t be evil” to let’s build an entire operating system out of Java and spyware. It’s not good. At this rate we aren’t even going to have GUIs anymore in 10 years, because no hardware will be able to run them without destroying itself, needing to be plugged in constantly, and requiring $1000 worth of RAM from some slave economy that has overpowered us as we have become so unproductive, since most people are using Windows 12 or some nonsense.
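
          As a small illustration of the “checks types in the compiler” point — C rejects genuine type errors at compile time, but converts freely between numeric types, which is what people usually mean by calling it loosely (weakly) typed:

          ```c
          #include <stdio.h>

          int main(void) {
              /* Implicit conversions are allowed without complaint: */
              double ratio = 1 / 2;    /* integer division happens first: 0.0 */
              double fixed = 1.0 / 2;  /* promoted to double division: 0.5 */
              printf("%g %g\n", ratio, fixed);

              /* A real type error, by contrast, never reaches runtime, e.g.:
                 int *p = 3.14;  // compile-time error: invalid conversion */
              return 0;
          }
          ```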

          • realitista@lemmus.org
            link
            fedilink
            English
            arrow-up
            3
            ·
            edit-2
            9 hours ago

            Python uses 10x the memory but probably 100x-1000x the CPU cycles to do the same thing. Also, using libraries written for interpreted languages is going to bloat your memory footprint, whereas C libraries are tiny and efficient.

            You’ve obviously never looked at benchmarks because you’re one or two orders of magnitude off.

            The same reason you don’t use assembly is the reason many use Python instead of C.

            As someone who was trained in C and did most of my programming in it, yes it does everything you need but it’s a major pain in the ass doing it well. It’s slow to get things done and you need decades to get competent at it. Python allows you to get up and running a lot faster.

            As CPU and RAM are cheap compared to the days when C was a necessity, most programmers have made the decision that getting things going fast and easy was worth the tradeoff. The market has spoken. There is still a place for C or Rust, but there’s also a place for Python and other interpreted languages. You can make good programs in both, but it’s a lot easier to make a garbage program in C.

            I’ve used at least 20 computer OSes dating back to the ’70s, and despite all your fearmongering, computers keep getting cheaper and easier to use, and for the most part, faster. I’ve got old Macs and PCs and Linux boxes lying around from 20-30 years ago, and trust me, they aren’t faster or easier to use. There were some good OSes like AmigaOS, and windowing systems like FVWM, that were surprisingly responsive for the time, but Windows and MacOS were all pretty garbage until about Windows 7 and Mac OS X. And they cost $4000+ in today’s dollars. You can get laptops these days for $150.

            • DarkAri@lemmy.blahaj.zone
              link
              fedilink
              arrow-up
              1
              ·
              3 hours ago

              Python really is that much slower. It has actually come a long way in the past few years, but it’s still an interpreted, strongly typed language. You can use libraries that are written in C or Rust or something to make Python run much faster, but anything you write in actual Python is extremely slow. It can be okay for scripting, basically like bash, but it’s not really good as a programming language, and writing applications in it is not good unless it’s a small project made by one programmer that does a specific useful thing.

              Software really is getting terrible. We are hitting a wall in terms of refining process nodes further, because we are at 2 nm and it’s really difficult to keep going. There is already way too much terrible code out there just destroying really powerful systems. We are evolving backwards: boot times in the early 2000s on low-end hardware were a few seconds for Windows XP. When I clicked an application, it either opened nearly instantly or within a couple seconds. It was a much better operating system than Windows 10 ever will be.

              The issue is that having even a single piece of Python or Java or Electron running can just completely saturate your memory bus and halve the speed of every operation you do. I had a PC with spotty thermal paste a while back, and opening Discord would overheat it lol.

              All I’m saying is that writing this type of code for production shouldn’t really be acceptable. It would be nice if we actually benefited from advancing computer technology, and new hardware wasn’t just an excuse to write worse software. I think operating systems should warn users when running terrible code: that this program is low quality and will slow down the system, or is taking as many resources as it can. We are in the age of 1000w computers with billions of transistors being taken out by webpages and OS spyware. The standards are just far too low. There is too much terrible software being written because companies are desperate to hire people who have no idea how to program in real languages, instead of paying for real programmers or helping people learn to code in those languages, and many of these companies are billion-dollar companies.

              Like I said, it’s bad for the user, it’s bad for the environment, it’s bad for your hard drives, and it’s bad for the economy. Not to go full Terry Davis on you, but computers should boot in under a second these days.

  • PhilipTheBucket@piefed.social
    link
    fedilink
    English
    arrow-up
    48
    arrow-down
    5
    ·
    1 day ago

    C is the old carpenter, who can drive in nails with three strikes of the hammer and never forgets his tools.

    C# is his friend who just uses power tools instead. He is fine too. He goes home early whenever he can.

    Python is the new guy at work who thinks he’s super smart. He actually can do the job really well, but for some reason nobody likes him all that much.

    Javascript is the boss’s son who got the job since he agreed to stay off pills but he does not. He is useful to be friendly with, maybe, but avoid him any day that you can. Typescript is his weird fiancée. She is significantly less stupid but much more rarely useful, and also best avoided.

    Go and Rust are tight-knit friends who get shit done. They are extremely capable but also not friendly, they tend not to talk much.

    Clojure does mushrooms on weekends, and seems to believe he has key insights the rest of the crew is too dim to understand, but he also makes frequent simple mistakes on the job and forgets things. Also avoid.

    Java only has the job because he’s known the boss since they were kids. He was never that good, but now he is old, and frequently drunk. Avoid at all costs.

    • pixeltree@lemmy.blahaj.zone
      link
      fedilink
      English
      arrow-up
      4
      ·
      edit-2
      31 minutes ago

      Nah, Java’s the super chatty guy who goes on and on and on and never shuts the hell up, making doing anything with him extra tedious

    • Capricorn_Geriatric@lemmy.world
      link
      fedilink
      arrow-up
      0
      arrow-down
      1
      ·
      8 hours ago

      Great story, but why anthropomorphize? Would tools not be a better analogy?

      C is like having a box of nails, manual tools and some mortar. The Notre Dame was built like that.

      C# is like having a box of screws and some power tools. Some tools are still manual. This is how your grandma’s house was built.

      Java is similarily a mix of old and new, but you also have stuff like cement. That’s how new schools were built.

      Python is like having a modern hardware store at your disposal. Big, clunky, and you need a stroll down the aisles before you find what could work. Should I use this power tool I know, or a new one more specific to the use case? What are those little plastic screw sleeve thingies? This is how modern homes are built.

      Javascript is like US power tools. When trying to switch to them you’ll question your sanity, but you can still get stuff done just as well. However, only power tools: Want to drive a nail into something? Gonna need a semiautomatic nailgun. Want to hit something hard? Can’t use a hammer, there’s a power tool for that. Oh, and your nails and screws are shapeshifting. This is how the Opera House was built.

      TypeScript is like JavaScript, but you retain your organized nailbox. For some reason, not many things were built with it.

      Go and Rust are like metric, traditional tools with some screws. However, they’re labeled in Chinese. In essence, it’s the same as your run-of-the-mill tools; it just takes some time to get to know them. This is how a hospital gets built in a week.

      I’ve never done Clojure, so wouldn’t know.

    • Otter@lemmy.ca
      link
      fedilink
      English
      arrow-up
      21
      ·
      1 day ago

      COBOL handles the books because no one else can understand the system and it’s too much work to change after 40 years

    • squaresinger@lemmy.world
      link
      fedilink
      arrow-up
      16
      arrow-down
      3
      ·
      1 day ago

      C is the old carpenter with leaky memory with heavy undiagnosed autism, who constantly cracks demented jokes like “Missing } at end of file”.

      He’s so mentally not there, in fact, that if you don’t specifically tell him to return to you after finishing the job, he will neither figure out what he’s supposed to do, nor tell you what went wrong; instead he will happily jump somewhere else, hallucinate commands from the structure of the walls, and start doing whatever the voices tell him to do.

          • PhilipTheBucket@piefed.social
            link
            fedilink
            English
            arrow-up
            1
            ·
            19 hours ago

            I mean yeah lol. That’s why I said “mostly.” But my point was, more or less, that modern power tools can do stuff that you simply can’t do with C, but C is still a venerable tool to me. I like it. The old pros can make fantastic custom cabinets, they do framing almost as fast as someone with a nail gun, it’s just that it’s not practical for most people to try to get skilled enough to be able to make solid stuff (and of course you can never make a skyscraper with just hand tools.)

            Once you start finding yourself using malloc() all that much, you’re probably using the wrong tool, and it’s also just objectively less secure than other safer languages. But clean C code has a kind of beauty to me that is hard to replicate in the more powerful languages.

    • Jankatarch@lemmy.world
      link
      fedilink
      arrow-up
      16
      arrow-down
      1
      ·
      edit-2
      1 day ago

      Rust is that one rare type of guy who refuses to round measurements so you end up with “the drawer is 28.34646 inches tall.”

      Clojure one is perfect lmao.

    • boletus@sh.itjust.works
      link
      fedilink
      arrow-up
      5
      arrow-down
      1
      ·
      1 day ago

      Yeah, and the new guy takes 3 days to finish the job that the old carpenter can do in 2hrs. And when he wants it done faster he quickly asks the old guy to do it for him. That’s why nobody else on the site likes him.

    • A_A@lemmy.world
      link
      fedilink
      arrow-up
      2
      ·
      15 hours ago

      “The C programming language is like debating a philosopher and Python is like debating someone who ate an edible”

      Here “edible” is a drug. It means Python goes in any and every direction, like someone high on drugs.

    • balance8873@lemmy.myserv.one
      link
      fedilink
      arrow-up
      7
      ·
      24 hours ago

      That’s because it’s a stupid take. Believing Brazil named a programming language after a Spanish word is pretty embarrassing too, but I guess English speakers do that constantly.

  • Captain Aggravated@sh.itjust.works
    link
    fedilink
    English
    arrow-up
    7
    arrow-down
    1
    ·
    1 day ago

    Python is my “native” programming language, it’s the first I learned, and many of my leaps in understanding of the language have resulted from thinking “Wait, Python is a smart ass. I bet it can do…”

  • Treczoks@lemmy.world
    link
    fedilink
    arrow-up
    3
    ·
    1 day ago

    also I just realized that Brazil did NOT make a programming language entirely in Spanish and call it “Si”

    Imagine Python did this, and people would be programming in Dutch!