By combining Shadertoy's webcam texture with its WebVR functionality, I was able to create two "Augmented Reality" fragment shaders.

An Augmented Reality (AR) shader in Shadertoy is based on the following technique:

  • The webcam texture is set as the background image of the shader. If the Shadertoy iOS app is used, you will see the input of your iPhone's front camera.
  • The mainVR function is implemented and renders a 3D scene on top of the background using the function's ray-origin and ray-direction arguments. If the Shadertoy iOS app is used, the origin and direction are derived from the position and orientation of your iPhone using ARKit.
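The two steps above can be sketched as a minimal GLSL shader. This is an illustrative example, not either of the shaders described in this post: it samples the webcam (assumed to be bound to iChannel0) as the background and ray-traces a single hypothetical sphere on top of it, using the real mainVR signature that Shadertoy calls with a per-pixel ray derived from the device pose.

```glsl
// Illustrative AR sketch (assumptions: webcam on iChannel0,
// sphere position/size chosen arbitrarily for the example).
float sphere(vec3 ro, vec3 rd, vec3 c, float r) {
    // Standard analytic ray-sphere intersection; returns distance or -1.
    vec3 oc = ro - c;
    float b = dot(oc, rd);
    float h = b*b - dot(oc, oc) + r*r;
    return (h < 0.0) ? -1.0 : -b - sqrt(h);
}

void mainVR(out vec4 fragColor, in vec2 fragCoord,
            in vec3 fragRayOri, in vec3 fragRayDir) {
    // Background: the webcam feed.
    vec3 col = texture(iChannel0, fragCoord / iResolution.xy).rgb;

    // A virtual sphere floating in front of the world origin.
    vec3 center = vec3(0.0, 0.0, -1.0);
    float t = sphere(fragRayOri, normalize(fragRayDir), center, 0.25);
    if (t > 0.0) {
        vec3 n = normalize(fragRayOri + t*normalize(fragRayDir) - center);
        col = vec3(0.5) + 0.5*n;  // simple normal-based shading
    }
    fragColor = vec4(col, 1.0);
}
```

Because mainVR receives the ray origin and direction directly, the same scene code works both on the website (where Shadertoy synthesizes a camera) and in the iOS app (where ARKit supplies the device pose).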

Finally, the shader is published using the "Public+API" option to make it accessible in the iOS app. You can then view the shader and explore the 3D scene in Augmented Reality.

I created two AR shaders. The first shows a portal between a virtual world and the "real world". Buffer A keeps track of the camera position and determines whether the user enters (or leaves) the portal. Analytical area lighting is used to light the virtual scene.

Because not all users have an iOS device (or the app installed), a screen capture of the shader in action is shown below:

Portal

You can find the fragment shader on Shadertoy: https://www.shadertoy.com/view/lldcR8.
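One way to implement the Buffer A bookkeeping is sketched below. This is a hedged reconstruction of the idea, not the shader's actual code: it persists the previous camera position and an inside/outside flag in a single self-feedback pixel, and flips the flag when the camera path crosses a hypothetical portal disc (here, radius 0.5 in the plane z = 0).

```glsl
// Sketch of the portal-crossing state buffer (assumptions: Buffer A
// feeds back into itself on iChannel0; portal placement is made up).
void mainVR(out vec4 fragColor, in vec2 fragCoord,
            in vec3 fragRayOri, in vec3 fragRayDir) {
    vec4 prev = texelFetch(iChannel0, ivec2(0), 0);  // last frame's state
    vec3 prevPos = prev.xyz;
    float inside = prev.w;  // 0.0 = real world, 1.0 = virtual world

    float d0 = prevPos.z, d1 = fragRayOri.z;  // signed distances to plane z=0
    if (d0 * d1 < 0.0) {
        // The camera crossed the portal plane; find the crossing point
        // and toggle the flag only if it lies within the portal disc.
        vec3 hit = mix(prevPos, fragRayOri, d0 / (d0 - d1));
        if (length(hit.xy) < 0.5) inside = 1.0 - inside;
    }
    fragColor = vec4(fragRayOri, inside);  // store state for next frame
}
```

The image pass can then read this flag to decide whether the webcam feed or the virtual world is the background, and render the other one inside the portal opening.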

The second shader shows the Menger sponge fractal in Augmented Reality. Again, a screen capture of the shader in action is shown below:

Menger Sponge

You can find the fragment shader on Shadertoy: https://www.shadertoy.com/view/XttfRN.
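The Menger sponge is typically ray-marched with a distance estimator. The sketch below shows the classic folding construction (a widely used technique, not necessarily this shader's exact implementation): start from a solid box and repeatedly carve out cross-shaped holes at ever finer scales.

```glsl
// Classic Menger-sponge distance estimator (standard technique;
// iteration count of 4 chosen arbitrarily for the example).
float sdBox(vec3 p, vec3 b) {
    vec3 d = abs(p) - b;
    return length(max(d, 0.0)) + min(max(d.x, max(d.y, d.z)), 0.0);
}

float menger(vec3 p) {
    float d = sdBox(p, vec3(1.0));  // bounding cube
    float s = 1.0;
    for (int i = 0; i < 4; i++) {
        // Fold space into the repeating 3x3x3 cell structure.
        vec3 a = mod(p * s, 2.0) - 1.0;
        s *= 3.0;
        vec3 r = abs(1.0 - 3.0 * abs(a));
        // Distance to the cross-shaped hole of this iteration.
        float c = (min(max(r.x, r.y),
                   min(max(r.y, r.z), max(r.z, r.x))) - 1.0) / s;
        d = max(d, c);  // carve the hole out of the current shape
    }
    return d;
}
```

A mainVR function like the one outlined earlier can march this distance field along fragRayOri/fragRayDir, so the fractal appears anchored in the room as the phone moves.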

Similar posts

If you like this post, you may also like one of my other posts:
