This is just one of the many far-reaching effects of the disinformation age we are headed into. It would not surprise me if, in the future (assuming humanity survives our climate crisis), this period is grouped with the Middle Ages as an era of great loss of human knowledge.
For what it’s worth, a lot of what the article is bringing up isn’t particularly new. Fake studies have been around for a long time, though the scope of them will definitely increase. While it is labor-intensive, this is largely solved by peer review. In fact, perhaps ironically, AI could be used to do a first-pass review, flag what looks AI-generated versus human-written, and pass that summary along to a human reviewer.
Corporation-funded studies designed to get around regulation or to promote a product, on the other hand, are something we’ve been dealing with for quite some time and an issue we haven’t really solved as a society. Anyone who works in science knows how to spot these from a mile away, because they’re nearly always published in tiny, low-citation journals that either don’t do peer review or have shady guidelines. The richer companies have the money to simply run 40 or 50 studies concurrently and toss the data from every one that doesn’t produce the outcome they’re looking for (in essence repeatedly rolling a d20 until they get a 20), which lets them get their findings into a bigger journal because each individual study looks above board. Some larger journals have created guidelines to try and fight this, but ultimately you need meta-analyses and other researchers to disprove the findings, and that’s fairly costly.
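To put rough numbers on the d20 analogy: if there is no real effect and each study is tested at the usual p < 0.05 threshold, running 40 independent studies gives roughly a 1 - 0.95^40 ≈ 87% chance that at least one comes out "significant." A minimal simulation sketch of that, with made-up sample sizes and purely illustrative parameters:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    def run_campaign(n_studies=40, n_per_arm=50, alpha=0.05):
        """Simulate n_studies two-arm trials of a product with NO real effect
        and report whether at least one comes out 'significant'."""
        p_values = []
        for _ in range(n_studies):
            control = rng.normal(0.0, 1.0, n_per_arm)  # placebo arm
            treated = rng.normal(0.0, 1.0, n_per_arm)  # product arm, same true mean
            _, p = stats.ttest_ind(treated, control)
            p_values.append(p)
        return min(p_values) < alpha

    # Analytic expectation: P(at least one false positive) = 1 - (1 - alpha)^n
    print("expected:", 1 - 0.95 ** 40)  # ~0.87

    # Empirical check over many simulated 40-study campaigns
    hits = sum(run_campaign() for _ in range(2000))
    print("simulated:", hits / 2000)    # should land near 0.87

Publish only the winning study, shelve the rest, and the published record looks clean even though the result is essentially noise.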
Also, as an aside: this belongs in the science community more than tech, in my opinion.