Buying up game developers to make them exclusives and selling hardware at a loss to stifle competitors is the only “benefit” their money has produced. This is a net negative for VR as a whole.
Like 90% of what a modern VR headset is made of has come from their money.
Like what? I can’t think of a single invention they pioneered that’s used in their own headsets, let alone everyone else’s.
A major one is inside-out tracking. It was the whole reason for the split that led to them and the team at Valve going their separate ways back in the day. What was then Oculus wanted to focus on computer vision tracking rather than laser tracking, because computer vision had the better future and wouldn’t take long to surpass where laser tracking already was. It was also the only realistic path to inside-out tracking any time soon.
They had to optimize computer vision tracking to the point that not just a mobile chip, but a co-processor alongside a mobile chip, could handle it. That optimization would easily have taken another decade without the compute power Facebook had access to for crunching all the data and training better and better algorithms. Hand tracking was insanely accelerated by the same work. Notice how crappy the hand tracking is on literally everything else? Even the Apple Vision Pro, with quadruple the camera resolution, a significantly clearer image, infrared projection, and laptop levels of processing power, has a hard time tracking hands anywhere near as accurately. And it relies on hand tracking alone, since it has no other method of input. Its hand tracking should be awesome if this were easy to do. It’s not easy, and yet Meta headsets make it look easy.
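To give a feel for what that tracking is actually doing, here’s a toy sketch of camera-based feature tracking in Python with OpenCV. This is nothing like Meta’s actual pipeline (which fuses camera and IMU data into a full 6DoF head pose on a dedicated co-processor), just the basic idea of following visual features from frame to frame; the webcam is only standing in for a headset camera.

```python
# Toy illustration of camera-based feature tracking (not Meta's pipeline):
# pick corners in one frame, then follow them into the next frame.
import cv2

cap = cv2.VideoCapture(0)                       # any webcam stands in for a headset camera
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                              qualityLevel=0.01, minDistance=8)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Follow each corner from the previous frame into the current one.
    new_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
    good = new_pts[status.flatten() == 1]
    # From how the matched points shift between frames, a real system
    # estimates how the camera (i.e. your head) moved.
    prev_gray, pts = gray, good.reshape(-1, 1, 2)
```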
Pancake lenses are downright one of the most important advances in VR headsets lately. If you haven’t tried a headset with pancake lenses, get to an electronics store and do a demo with either the Quest Pro or the Quest 3. The clarity is insane, and the total light retained is so high that the screens suddenly look 30% brighter, with no distracting artifacts.
The Pro controllers’ optical reckoning is definitely the way of the future. It’s still a little expensive now, so you won’t see it in cheap headsets yet, but it won’t be long before that is just the right way to do controllers. They track just as accurately, except the headset doesn’t need to be able to see them. So you can aim a bow for more than a few seconds, or have your hands down at your sides when relaxing. There are a whole bunch of game mechanics that can’t currently be used in standalone VR games, as opposed to PCVR, because most standalone controllers can’t be tracked in many positions.
They funded the creation of, and took part in the design of, -the- mobile XR chipset, and then its successor. Almost every headset uses it now, rather than each company funding one of its own.
The way passthrough works on good headsets, recreating the world around you in 3D from the camera feeds so that virtual objects can properly occlude, and be occluded by, real ones, would again have taken much longer without access to the processing power at Facebook. For now it isn’t even in the released version of the headset software; it’s still mostly in internal testing, but it’s getting close. And it’s going to be huge for mixed reality.
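The occlusion part is simpler than it sounds once you have a depth estimate of the room. Here’s a minimal sketch of the idea in Python, with made-up arrays standing in for the 3D reconstruction a real headset would build from its camera feeds: each pixel just shows whichever is closer, the real world or the virtual object.

```python
# Minimal sketch of passthrough occlusion with made-up data:
# per pixel, show whichever is nearer, the real room or the virtual object.
import numpy as np

h, w = 480, 640
camera_rgb = np.random.rand(h, w, 3)             # passthrough camera image (placeholder)
real_depth = np.full((h, w), 2.0)                # estimated distance to the room, in meters

virtual_rgb = np.zeros((h, w, 3))
virtual_rgb[:, :, 1] = 1.0                       # a green virtual object
virtual_depth = np.full((h, w), np.inf)
virtual_depth[180:300, 260:380] = 1.5            # the object floats 1.5 m away

# Per-pixel depth test: the virtual object hides the wall behind it,
# and would itself be hidden by anything real that is closer.
show_virtual = virtual_depth < real_depth
composite = np.where(show_virtual[..., None], virtual_rgb, camera_rgb)
```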
There are a bunch more individual pieces of headsets that either wouldn’t exist yet or wouldn’t be cheap enough to make yet without Meta’s R&D money, and a bunch more coming soon.
Here’s the thing: I hate Meta as much as anyone else. But I want the best hardware, you really can’t get it anywhere else, and it might be a long time before that changes. When it changes, I’ll buy that instead. I don’t need old games from my old headsets to carry forward to the new one. At any given time I’ll be playing two or three games at most, and I can choose whether to buy those for the new headset or finish them on the current one. Just like with old consoles, if I want to play those old games again, I can either play them on the old console/headset or wait until there’s a way to emulate them on the new hardware. Or a remake.
And to address your other point about the exclusives: yes, they bought some studios, but they also put a ton of money into other games without requiring them to be exclusive to Oculus/Meta headsets or stores, only requiring that there at least be a version for their headsets and store. Those companies could have made versions or ports for other stores and hardware, and many did. Some didn’t, because those versions didn’t get funded.
I stopped reading when you implied that Facebook invented pancake optics. They have been used in cameras for decades. And while I agree they’re the way forward, saying they let more light in is factually incorrect: they only let about 10-15% of the light through. This page has a good overview of why that is and how they work.
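The short version, assuming the usual folded design with a 50/50 mirror: light has to cross that element twice, so even the ideal case tops out at a quarter of the light, before polarizer and coating losses.

```latex
% Idealized folded ("pancake") optic: light crosses the 50/50 element twice,
% once transmitted and once reflected.
T_{\text{ideal}} = 0.5 \times 0.5 = 25\%
% Real polarizer, waveplate, and coating losses bring this down to
% roughly the 10-15\% range.
```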
Ok, well, I guess I’m sorry I fell into your trap and got one thing wrong. You wanted me to list a bunch of stuff off the top of my head; I’m just some random guy. It doesn’t matter if the rest is right, I failed your test. I assumed that since all the headsets with pancake lenses are so much brighter than the old ones, without worse battery life, they must let more light through than the previous lens types did. But I guess it was some other advance in some other tech that lets them run the screens 10x as bright at the same battery cost.
And I’m not sure what part made it sound like I thought they invented the very idea of pancake lenses. You wanted me to list the innovations they brought to VR, and making pancake lenses affordable for VR is very much something they did, by spending money. The money went to a company that was already making them for other applications, but it was still Meta’s money that brought them to VR.
I may not have every detail perfect. But your take is that Meta spending 25 billion dollars on advancing VR was unnoticeable and did nothing…
What? I didn’t want you to list a bunch of things off the top of your head. I asked for one factual thing, and instead you provided a bunch of assumptions. If you can’t provide actual facts, maybe just don’t state guesses like they’re true?
Sorry, I took your “like what?” response to my statement that they had financed a wide variety of technologies for VR headsets to mean you wanted examples, not just one thing.
But when I was wrong about one of the things, it invalidated everything else anyway. So the other examples don’t count anymore.
I didn’t know this was supposed to be my thesis paper either. I’m sorry I didn’t put hours of research into what I thought was a conversation. If you had asked me to cite my sources, I probably could have put more effort in, and maybe I would have found out that I guessed one thing wrong. One thing that didn’t even matter to the actual topic of the conversation, but that automatically invalidated all the other right things I said.
I drew the wrong conclusion from the screens being brighter; I assumed the lens must have helped with that, and I will never make that mistake again. It turns out it was just some other tech advancement I previously didn’t know about, one I could have added to the list if I had done all the research you wanted.