I created the world's first 360° spatial AI platform on Apple Vision Pro - built natively for visionOS, never owned the device [P]
I want to show you something nobody has seen before. Three months ago I had zero coding knowledge. I couldn't write a single line of code. Since then I taught myself GitHub, Visual Studio, Xcode, Android Studio, Firebase, Firestore, Vercel, and Sentry, and built a fully functional AI platform live across web, iOS, Android, Mac desktop, and Apple Vision Pro. For roughly three months I worked on this project 16 hours a day to get it where it is on web, Android, and iOS. Today I converted it into something completely new.

Asksary is a world-first fully spatial AI experience, built natively for visionOS. Not an iPad app running in compatibility mode - a ground-up, native spatial build where the entire interface is a live immersive 360° wallpaper. You don't open the app; you step inside it, with realtime voice chat through OpenAI's WebRTC Realtime API, eight voices, and near-zero latency.

In the first screenshot you'll see GPT-5 greeting you from inside the spatial environment - floating UI, particle effects, and a starfield you're literally standing in, all happening inside a 360° world. The screenshot shows what realtime voice chat looks like. Put on the Vision Pro, change the 360° spatial background, and chat with OpenAI in realtime with near-zero latency.

It currently runs GPT-5.2, Claude Sonnet 4.6, Grok 4, Gemini Ultra, DeepSeek R1, and o1 Pro, with 30 live interactive wallpapers and themes. Each one is a different world to inhabit while you work. Beyond the spatial shell, the platform includes:
I wanted to build something that made people say wow. Something nobody had done. I think this might be it. I did this without ever having a Vision Pro at hand to help me develop the concept, so I've never experienced it for myself, but I have a pretty good imagination of what it would be like. This Apple Vision Pro version is not currently available on the App Store, but if people are genuinely interested I'll release it soon enough. Would love to hear what you think of the whole idea. It's a fully working build, not a prototype or demo.