I mean on a technical level. Are the devices that make up the infrastructure of the internet hardwired with IPv4? Is the firmware on these devices impossible to upgrade remotely?

If it’s just a matter of software or firmware then adoption should only take like a year but clearly that isn’t the case. So what specifically is stopping us?

  • NaibofTabr@infosec.pub · 6 months ago

    The Internet is a fantastic example of building the airplane while you’re flying it. We can’t just put this thing on the ground and rebuild the engine; we’re in flight, and there’s a lot riding on it.

    IPv4 was drafted in 1981 (RFC 791) and adopted by ARPANET in 1983. For all practical purposes, there was no “internet” yet - which is to say that IPv4 predates the Internet.

    IPv6 was drafted in 1998 (RFC 2460), but wasn’t ratified as a full Internet Standard until 2017 (RFC 8200). The Internet had grown exponentially long before any manufacturers were even considering implementing IPv6.

    There is a mountain of telecom infrastructure built over the past 40 years that still has legacy hardware scattered through it. There is a jungle of interdependency tangled through firmware and low-level software that no one living really understands. And there is an ocean of application software built on assumptions about the underlying infrastructure - software that was never designed to be updated, and whose creators are long since retired.
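
    To make the “baked-in assumptions” point concrete, here’s a hypothetical sketch (Python; the function name and key format are made up) of the pattern: code that quietly equates “an IP address” with a 32-bit integer. Nothing in it is wrong for IPv4, and all of it breaks for IPv6.

    ```python
    import socket
    import struct

    # Hypothetical example of the baked-in assumption described above:
    # "an IP address" and "a 32-bit integer" treated as the same thing.
    def ip_to_key(addr: str) -> int:
        """Pack an address into a fixed 32-bit lookup key."""
        # socket.inet_aton only understands dotted-quad IPv4, and the
        # 32-bit key format couldn't hold an IPv6 address (128 bits) anyway.
        packed = socket.inet_aton(addr)
        return struct.unpack("!I", packed)[0]

    print(ip_to_key("192.0.2.1"))  # 3221225985 - works fine
    try:
        ip_to_key("2001:db8::1")   # IPv6 never fit the design
    except OSError as e:
        print(f"IPv6 breaks the assumption: {e}")
    ```

    Multiply that by every database schema, wire format, and config file that reserved exactly four bytes for an address, and “just update the software” stops sounding simple.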

    Anyone want to take bets on how many pieces of slapdash web software out there use some hard-coded regex to pick IPv4 addresses out of strings? Good luck getting those things updated. IPv4 is going to be with us for a long time in the form of shared libraries, Nth-tier dependencies, and legacy hardware drivers.
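
    For illustration, a hypothetical sketch of that kind of regex (Python; the header and function names are made up). The pattern happily matches a dotted-quad IPv4 address and silently finds nothing in an IPv6 one - no error, no log entry, the client just vanishes:

    ```python
    import re

    # Hypothetical hard-coded pattern of the kind described above: a regex
    # written years ago to pull "the IP address" out of a header or log line.
    IPV4_RE = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")

    def extract_client_ip(header: str) -> str | None:
        match = IPV4_RE.search(header)
        return match.group(0) if match else None

    print(extract_client_ip("X-Forwarded-For: 203.0.113.7"))  # '203.0.113.7'
    print(extract_client_ip("X-Forwarded-For: 2001:db8::7"))  # None - IPv6 client silently dropped
    ```

    The failure mode is the dangerous part: nothing crashes, so nobody notices until an IPv6-only user files a bug report that no one can reproduce.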