360 VR video showcasing

Discussion in 'Mapping Questions & Discussion' started by AI_, Jan 13, 2019.

  1. AI_

    AI_ L1: Registered

    Messages:
    9
    Positive Ratings:
    5
    Hi there, I am the owner and developer of Jump Academy, and I am currently working on a project that streamlines map showcasing in 360 VR. The motivation was to use this to showcase jump maps and individual jumps for informative purposes. However, I figured this may be useful for the Source mapping community in general.

    Here is an example of the output so far (tested in CS:GO, whose generally better texture quality helps with qualitative evaluation):


    View: https://www.youtube.com/watch?v=PVEXCNy70eU


    Note: some work is still needed on stitching so the sides of the 3D cube map blend better.

    This is all done with freely available software (SourceMod, Python, OpenCV, FFmpeg), so there is no need for commercial software like Adobe Premiere. The recording can be done in-game, so there is no need to port the map to Source Filmmaker either.

    Please let me know if you are interested, and I will keep you posted on development and releases.
     
  2. Kobolite

    aa Kobolite Your local dutch person

    Messages:
    647
    Positive Ratings:
    540
    ooh i really like these kinds of videos so i'm interested
     
  3. MegapiemanPHD

    aa MegapiemanPHD Doctorate in Deliciousness

    Messages:
    1,064
    Positive Ratings:
    572
    This looks great! How easy is it to use?
     
  4. AI_

    The hardest part is probably getting the prerequisites installed and working. It needs Metamod + SourceMod on the local game server, plus Python libraries for communicating with that server and, later, for image processing.

    After that, this is the procedure so far:

    1. Create the camera paths in-game using the SourceMod plugin, via an in-game menu and drag-and-drop path nodes. This can take around 10-15 minutes per map depending on pickiness, and the path can be replayed live. There's actually an option to have the camera point in the direction it is traveling along the path, for those who just want a regular fly-through video without the VR:


    View: https://www.youtube.com/watch?v=DCdXzvN5rGU


    2. Run the Python recording script while the game is running the map. It will talk to the local server and automatically start/stop recording (using the hotkey for your video recorder, e.g. FRAPS or Nvidia ShadowPlay) for each of the 6 view angles needed for the cube map. The SourceMod plugin controls the player camera during this whole process. This outputs 6 video files.

    3. Run the Python stitching script with these 6 files as input. Currently this outputs all the frames as PNG images after converting them to the equirectangular projection that most VR viewers and editing software understand. It runs at about 2 seconds per frame at 4K output resolution, CPU-only, on my rig with an Intel i7-3770K clocked at 4.3 GHz.

    4. Run FFmpeg with the frames as input to produce the final video at the desired frame rate and bitrate.
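    The cube-to-equirectangular conversion in step 3 can be sketched roughly like this, using plain NumPy with nearest-neighbour sampling. The face names and orientations here are assumptions for illustration, not the actual script; a real implementation has to match the engine's view angles exactly (and the thread's version uses OpenCV and proper interpolation):

```python
import numpy as np

def cube_to_equirect(faces, out_h):
    """Resample 6 square cube faces (dict of HxWx3 uint8 arrays) into a
    2:1 equirectangular panorama. Face orientation is an assumed convention."""
    out_w = 2 * out_h
    # Longitude/latitude at every output pixel centre.
    lon = (np.arange(out_w) + 0.5) / out_w * 2 * np.pi - np.pi
    lat = np.pi / 2 - (np.arange(out_h) + 0.5) / out_h * np.pi
    lon, lat = np.meshgrid(lon, lat)
    # Unit view direction per pixel (x forward, y left, z up).
    x = np.cos(lat) * np.cos(lon)
    y = np.cos(lat) * np.sin(lon)
    z = np.sin(lat)
    ax, ay, az = np.abs(x), np.abs(y), np.abs(z)
    eps = 1e-12  # avoid divide-by-zero outside each face's own mask
    n = next(iter(faces.values())).shape[0]  # faces are square, same size
    out = np.zeros((out_h, out_w, 3), dtype=np.uint8)
    # (name, pixels belonging to this face, horizontal/vertical face coords)
    sides = [
        ('front', (ax >= ay) & (ax >= az) & (x > 0), -y / (ax + eps), -z / (ax + eps)),
        ('back',  (ax >= ay) & (ax >= az) & (x <= 0), y / (ax + eps), -z / (ax + eps)),
        ('left',  (ay > ax) & (ay >= az) & (y > 0),   x / (ay + eps), -z / (ay + eps)),
        ('right', (ay > ax) & (ay >= az) & (y <= 0), -x / (ay + eps), -z / (ay + eps)),
        ('up',    (az > ax) & (az > ay) & (z > 0),   -y / (az + eps),  x / (az + eps)),
        ('down',  (az > ax) & (az > ay) & (z <= 0),  -y / (az + eps), -x / (az + eps)),
    ]
    for name, mask, u, v in sides:
        # Map face coordinates in [-1, 1] to integer pixel indices.
        ui = np.clip(((u[mask] + 1) / 2 * n).astype(int), 0, n - 1)
        vi = np.clip(((v[mask] + 1) / 2 * n).astype(int), 0, n - 1)
        out[mask] = faces[name][vi, ui]
    return out
```

    The per-pixel divisions are what make this slow on CPU; precomputing the sampling maps once per resolution and reusing them per frame (e.g. with cv2.remap) is the usual speedup.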
     
  5. AI_

    You can also skip the video recording process entirely if you only want to process a single frame from screenshots of the 6 faces. This would be useful if this site ever adds support for VR previews in the browser without going through YouTube; there are public JS libraries for static-image VR.
     
  6. AI_

    The script now supports calling HLAE to dump the frames. HLAE also allows us to force a non-integer FOV value so we can better align the cube faces. The issue with having different exposure levels for each cube surface was also fixed.
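    One guess at why a non-integer FOV helps alignment: if an exactly 90° FOV maps the face boundary to the outer pixel *edges*, the outermost pixel *centres* stop half a pixel short of the seam on every face. Widening the FOV so that pixel centres land exactly on the boundary gives adjacent faces a shared sample line to blend across. The formula below is my assumption for illustration, not HLAE's actual behaviour:

```python
import math

def seam_aligned_fov(face_pixels):
    """Hypothetical FOV (degrees) placing the outermost pixel centres of an
    n-pixel cube face exactly on the face boundary: tan(fov/2) = n/(n-1)."""
    return math.degrees(2 * math.atan(face_pixels / (face_pixels - 1)))
```

    For a 1080-pixel face this gives a value just above 90°, and the needed widening shrinks as the face resolution grows.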



    It's not perfect, but I think this is the best I can do for now.
     
  7. Crash

    aa Crash func_nerd

    Messages:
    3,198
    Positive Ratings:
    5,043
  8. Fantasma

    aa Fantasma

    Messages:
    891
    Positive Ratings:
    1,039
    Looks really cool. It's very similar to my own method, but the major change is using SourceMod instead of SFM, which eliminates 90% of the issues my approach has. I'm interested in how this will impact particles when stitching.
    And for more sharing, here's the blog post I did on the endeavour, which may hold something useful. Feel free to msg me on discord if you wanna discuss more about it @AI_ : https://fantasmos.wordpress.com/201...reating-a-4k-360-3d-60fps-video-of-a-tf2-map/
     
  9. AI_

    Thanks guys. These were actually the videos and tutorials that motivated me to start this project in the first place.

    Previously (and in my videos above) I also recorded each camera direction as a separate video, then synchronized the frames before stitching. After working on this a bit more, though, SourceMod lets me control the camera direction per tick and record frames per tick with HLAE. So at the moment I am experimenting with freezing the camera rig, recording through each camera angle for 1 tick each, then unfreezing it to move forward one tick, and repeating. Even with particles, recording this way should minimize discontinuities.
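    The freeze-and-step loop described above can be sketched like this (plain Python pseudologic; the real implementation lives in the SourceMod plugin and HLAE, and all four callback names here are hypothetical stand-ins):

```python
FACES = ['front', 'back', 'left', 'right', 'up', 'down']

def record(path_nodes, pause_world, set_camera, capture_frame, advance_tick):
    """Capture all 6 cube faces per path node while the world is frozen,
    then let the world advance one real tick between nodes."""
    frames = []
    for node in path_nodes:
        pause_world(True)              # freeze entities and particles
        shots = {}
        for face in FACES:             # one tick per face while frozen
            set_camera(node, face)
            shots[face] = capture_frame()
            advance_tick()
        pause_world(False)
        advance_tick()                 # world moves exactly one real tick
        frames.append(shots)
    return frames
```

    The point of the freeze is that all 6 faces of a given output frame see the world in the same state, so moving particles no longer tear across face seams.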

    My latest experiment with this tick-tock-like mechanism allowed me to record in SBS (side-by-side) stereo:



    Stereo in VR seems like it would be painful, so I'll defer that for later. I will be redoing the VR video next using this method.
     
    Last edited: Feb 9, 2019
  10. AI_

    Here is the output in VR using the new method. The stitching is finally seamless now.

    This time the angle for the camera rig moves in the direction of travel instead of the usual axis lock, so there's less neck turning involved when viewing with VR goggles.



    What do you guys think? Do you prefer this or axis lock?
     
  11. AI_

    Here's the normal axis-aligned VR output with the new method. The stitches are seamless here now, too.

     
  12. AI_

    Here's what it looks like after re-enabling particles and moving things like clouds, trees, and chickens:



    Unfortunately, it turns out that letting 6 ticks go by per output frame is still quite noticeable, because everything else ends up moving 6x faster than normal.

    Also, testing the video bitrate, it looks like upwards of 360 Mbps is needed for it not to look like utter crap at lower resolutions after YouTube's processing. The video in the previous post was at 250 Mbps and looks very distorted going from 4K down to even just 1440p. VR seems very sensitive to video compression artifacts.
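    At these bitrates, the FFmpeg step from the earlier post might look like the following. The flags and filenames here are a sketch of a typical invocation, not the exact command used in the thread:

```python
import subprocess

def build_encode_cmd(frame_pattern, out_file, fps=60, bitrate='360M'):
    """Assemble an FFmpeg command turning numbered PNG frames into a video.
    Flag choices (x264, yuv420p) are illustrative assumptions."""
    return [
        'ffmpeg',
        '-framerate', str(fps),
        '-i', frame_pattern,        # e.g. 'frames/%06d.png'
        '-c:v', 'libx264',
        '-b:v', bitrate,            # high bitrate to survive YouTube's re-encode
        '-pix_fmt', 'yuv420p',      # broad player compatibility
        out_file,
    ]

# Running it would look like:
# subprocess.run(build_encode_cmd('frames/%06d.png', 'out.mp4'), check=True)
```

    Note that YouTube also requires 360° metadata in the container before it treats the upload as a VR video; that is a separate injection step after encoding.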
     
    Last edited: Feb 10, 2019
  13. Da Spud Lord

    aa Da Spud Lord L0: Crappy Member

    Messages:
    938
    Positive Ratings:
    644
    "host_timescale 0.17"?
     
  14. AI_

    Unfortunately, no. The game's tick rate does not change with that, so we end up with the same problem.

    I tried changing the camera angles between ticks, but the engine does not support this. It might be possible on HLAE's end with STV/GOTV demo playback instead of a live server, though. I'll do some experimenting with this.