Using Gyro-Data as tracking data (e.g. from Sony A7S III)
Newer cameras like the A7S III record not only the video footage but also gyroscopic data, which Sony's own Catalyst Browse software uses for stabilization. Since the camera already knows how it moves through a room, this data could be used to create an accurate virtual camera inside After Effects.
So instead of software tracking in After Effects, it would use the actual gyro data. This could be more accurate and far quicker than tracking.
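To illustrate the idea, here is a minimal sketch of the core step such a workflow would need: integrating the camera's gyro samples (angular velocity over time) into per-frame rotation keyframes for a virtual camera. The function name and the sample format are hypothetical, not anything Sony or Adobe actually exposes; a real tool would also use quaternions rather than naive Euler integration.

```python
def gyro_to_keyframes(samples, sample_rate, frame_rate):
    """Integrate gyro angular velocities into camera orientations.

    samples: list of (wx, wy, wz) angular velocities in deg/s
             (hypothetical format -- real gyro metadata differs per vendor).
    sample_rate: gyro samples per second.
    frame_rate: video frames per second.
    Returns one (rx, ry, rz) orientation in degrees per video frame.
    """
    dt = 1.0 / sample_rate
    rx = ry = rz = 0.0
    orientations = []  # orientation after each gyro sample
    for wx, wy, wz in samples:
        rx += wx * dt  # simple Euler integration; a production tool
        ry += wy * dt  # would use quaternions to avoid gimbal lock
        rz += wz * dt
        orientations.append((rx, ry, rz))
    # Resample from gyro rate down to the video frame rate
    # (nearest-sample for brevity; real code would interpolate)
    step = sample_rate / frame_rate
    n_frames = int(len(samples) / step)
    return [orientations[min(int(i * step), len(orientations) - 1)]
            for i in range(n_frames)]

# Example: a camera panning at a constant 30 deg/s for one second,
# with gyro sampled at 200 Hz, resampled to 25 fps keyframes.
keys = gyro_to_keyframes([(0.0, 30.0, 0.0)] * 200, 200, 25)
```

The resulting keyframes could then be applied to an After Effects camera layer, e.g. via ExtendScript, which is exactly the step that software tracking currently has to estimate from pixels.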
An iPhone app that does something similar is CamTrackAR. I would love to see this capability come to professional cameras ;-)
Nate Foster commented
The Sony A7S III does stabilization very similarly to how SteadXP does it, and I believe GoPro's method is much the same. But these are all 2D fixes. Getting access to that data to feed into and drive After Effects' Warp Stabilizer would be ideal and a VERY welcome addition. Getting camera position data to create a virtual camera is a totally different thing altogether.
While this would be fantastic to have, it's a camera-software, camera-accessory, and hardware limitation, not an After Effects issue. CamTrackAR is a fun tool that I've used with mixed results, and I'd be very curious to try it on a LiDAR-enabled iPhone to see if there are improvements. I've been asking camera developers for this for over a decade, ever since I found that many cameras already have accelerometers and/or gyroscopes inside them. If you look at .R3D files in REDCINE, buried in the Metadata panel there's a column with accelerometer data. I've been trying to access it since 2009. I sat down with RED's lead software developer at the time to ask for access to this data through the SDK, so tracking software could use it as a hint or seed path for motion tracking. To the best of my knowledge, they'd never heard of anyone asking for this.
The whole virtual production wave going on right now is exciting to me, because those teams have figured out that you need to find the camera's position and track it live for the technology to work. Passing that data through the pipeline would give postproduction an immense resource to pull from. I'm hoping camera manufacturers start realizing this and begin implementing solutions.