APRIL 2025 – EMERGING TECH REPORT: AI, XR & INTERACTIVE TECHNOLOGY – LATEST DEVELOPMENTS
https://solarflarestudio.co.uk/april-2025-emerging-tech-report-ai-xr-interactive-technology-latest-developments/
Fri, 02 May 2025 – Solarflare Studio


April marked a noticeable move toward more embodied, hands-free, and responsive XR experiences. Wearables progressed rapidly with new capabilities from Meta, Google, and Apple. Meanwhile, gesture-based spaces, real-time AI, and adaptive visuals pointed to a future where interaction centres on environment, instinct, and personalisation rather than screens.

Here’s what stood out this month – and how it’s shaping the next phase of interactive design.

🕶 SMART GLASSES, AI WEARABLES & PERSONAL AR

1 - Meta Ray-Ban Smart Glasses – Live Translation, Visual Search & AI Expansion

What’s new
Meta expanded its live translation and Meta AI features to more EU countries, bringing visual search, object identification, and spoken interaction directly into the user’s eyeline.

Why it matters
This approach establishes wearables as everyday tools – offering instant access to contextual information without needing a phone.

Where this is useful

  • Navigation and wayfinding at events
  • Guided activations or cultural tours
  • Frontline roles needing language support

Our opinion
The market has finally moved from concept to practical implementation. These updates highlight where smart glasses truly excel – delivering subtle, useful guidance within immersive brand experiences. Our current prototypes focus on this precise combination of audio, glance, and gesture without screens. We’re seeing particularly strong results in cultural venue applications, where visitors engage with content while maintaining awareness of their surroundings.

2 - The AR Glasses Race – Meta vs Google vs Apple

What’s new
Google previewed its Gemini-powered glasses at TED 2025, with features like live translation and on-object information. Apple’s Project N50 continues to build momentum, while Meta’s Orion and Ray-Ban lines scale across more markets.

Why it matters
Three tech giants are now visibly committed to AR glasses. That’s meaningful – because it signals investment not just in hardware, but in the ecosystems needed to support them.

Where this is useful

  • Self-guided tours and location-triggered content
  • Smart factory overlays
  • Retail navigation and inventory visibility

Our opinion
We’ve already shifted our R&D to develop activations without handheld devices. These updates confirm that direction. With AR wearables gaining momentum from multiple manufacturers, we predict this technology will transition from niche to standard in experience design within 12–18 months. Our most recent client workshops show increasing appetite for planning these implementations now rather than waiting for mass adoption.

Video Credit: Google @ TED2025

🧠 AI CHATBOTS WITH REAL-TIME WEB + FILE MEMORY

3 - GPT-4.1 Chatbots – Live Web + Brand-Trained Responses

What’s new
The latest OpenAI model blends live web search with internal file memory – allowing chatbots to combine public knowledge with custom brand training.

Why it matters
This creates responsive AI systems capable of answering questions with both live data and organisation-specific context.

Where this is useful

  • Interactive product advisors at live events
  • Voice-based installations
  • Experiences that generate visuals, responses or narration on request

Our opinion
A recent project we delivered employed this approach – training a chatbot on internal content whilst pulling current information in real time. The result functioned as a dynamic interface rather than a simple tool. This technique allows experiences to adapt to guests spontaneously through conversation, voice, or generative visuals. We’re particularly impressed with how this reduces the knowledge maintenance burden for brands with rapidly evolving product lines.
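As a rough sketch of that pattern – retrieve brand-trained facts, fetch live data, then hand both to the model as shared context – consider the following. Everything here (`BRAND_FACTS`, `search_web`) is a hypothetical stand-in, not a real API:

```python
# Hypothetical sketch of the "brand memory + live web" pattern.
# BRAND_FACTS stands in for a vector store of brand-trained content;
# search_web stands in for a live web-search call.

BRAND_FACTS = {
    "warranty": "All products carry a two-year warranty.",
    "launch": "The new range launches in September.",
}

def search_web(query: str) -> str:
    # Placeholder for a real web-search API call.
    return f"[live result for '{query}']"

def answer(query: str) -> str:
    """Merge organisation-specific context with live public data."""
    brand_context = [fact for key, fact in BRAND_FACTS.items() if key in query.lower()]
    live_context = search_web(query)
    # In production the combined context would be passed to the model as a
    # prompt; here we return the merge itself to show the structure.
    return " ".join(brand_context + [live_context])

print(answer("When is the launch?"))
```

In a real deployment the merged context feeds a model call; the point is that neither source alone answers both brand-specific and current-events questions.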

🎟 MULTI-SENSORY IMMERSION & SIMULATED MOTION

4 - Robotic Arm Motion Rides – Synchronized VR for Immersive Attractions

What’s new
New VR setups using industrial robotic arms are allowing real-time physical synchronisation with virtual environments. One notable demo this month showcased a high-speed dragon ride simulation with full-body G-force and motion feedback.

Why it matters
These systems combine precision movement, wind, visuals and sound – offering true multi-sensory immersion for audiences.

Where this is useful

  • High-impact theme park activations
  • Public showcases for sports, film or entertainment
  • Location-based VR storytelling


Our opinion
Our design approach now incorporates movement as a narrative driver rather than merely visual content. This technology creates memorable physical moments, and we’re seeing brands invest in spatial design that balances cinematic quality with tangible sensation. The opportunities for branded entertainment in particular have expanded dramatically – we’ve already begun a pilot project using scaled-down versions for corporate events.

Video Credit – RoboEnter

🧱 INTERACTIVE SPACES & GESTURE TRACKING

5 - Gesture Walls & Spatial Games – Full-Body Interaction

What’s new
Recent activations have shown real-time tracking powering gamified walls and gesture-based painting – enabling guests to physically interact with visuals through movement and posture.

Why it matters
This form of body-led interaction removes the need for handsets or buttons – ideal for open, accessible, drop-in installations.

Where this is useful

  • Fan parks and family-oriented brand zones
  • Experiential education
  • Public art and play spaces


Our opinion
These implementations demonstrate how intuitive, touch-free design has become essential for open public spaces. We’re applying similar configurations to create responsive playgrounds and brand touchpoints where engagement occurs naturally through motion. The removal of technical barriers has dramatically increased dwell time in our recent installations – visitors engage for 2-3 times longer when interactions feel physical rather than digital.
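The core mapping behind a gesture wall is simple: normalised body-tracking coordinates quantised onto a tile grid. A minimal sketch – grid size and input frames are illustrative, and any tracking system supplying 0–1 coordinates would do:

```python
# Sketch: map tracked positions (normalised 0-1 coordinates from any
# body-tracking system) onto a grid of wall tiles that light up.

GRID_COLS, GRID_ROWS = 8, 4

def tile_for(x: float, y: float) -> tuple[int, int]:
    """Return the (col, row) wall tile under a normalised position."""
    col = min(int(x * GRID_COLS), GRID_COLS - 1)  # clamp so x == 1.0 stays in range
    row = min(int(y * GRID_ROWS), GRID_ROWS - 1)
    return col, row

lit = set()
for x, y in [(0.1, 0.2), (0.95, 0.95), (0.1, 0.25)]:  # simulated tracking frames
    lit.add(tile_for(x, y))
print(lit)
```

In an installation, the `lit` set would drive projection or LED output each frame rather than a print.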

Video Credit: @roelofknol

Video Credit: Remi Bigot

6 - Platform 9¾ Illusions – Magic Through Simple Design

What’s new
Universal Orlando’s Platform 9¾ queue experience demonstrates how simple mirror illusions create convincing magic – guests appear to vanish through a brick wall just like in the Harry Potter films.

Why it matters
It’s proof that powerful storytelling doesn’t require expensive technology – just clever stagecraft, thoughtful design, and strategic misdirection.

Where this is useful

  • Theme park queues and transitions
  • Retail brand experiences
  • Pop-up installations and traveling exhibits
  • Budget-conscious immersive storytelling

Our opinion
We’re consistently inspired by these brilliantly efficient design solutions. Universal’s approach perfectly captures how theatrical techniques often outperform complex technology. With the right staging and psychological understanding, even the simplest mirror trick can create a moment of genuine wonder that guests remember and share. These low-tech but high-impact experiences are often more reliable, maintainable, and ultimately more magical than their high-tech counterparts.

Video Credit: @parkbenchtravel


7 - AI-Driven Visual Storytelling – Responsive, Trained Media

What’s new
Installations combining AI-generated content with live sensors, allowing projection visuals to shift based on audience interaction, emotion or movement.

Why it matters
These setups dynamically evolve with the audience, shaping stories based on interactions, emotions, and the situation.

Where this is useful

  • Museums and interactive gallery spaces
  • Art-led activations
  • Responsive theatre or music visuals
  • Branded corporate spaces – reception walls


Our opinion
We’re currently developing adaptive visual systems that convey narratives through rhythm, data and reaction. As brands seek more expressive formats, this technology provides a flexible canvas for style, narrative and atmosphere. Our recent museum collaborations show particular promise where content needs to remain fresh for returning visitors – the systems adapt and evolve rather than simply repeating the same experience.
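Under the hood, systems like this reduce to mapping sensor readings onto rendering parameters each frame. A deliberately simple sketch – the sensor names, ranges, and parameter set are all illustrative:

```python
# Sketch: derive projection parameters from live sensor readings so visuals
# shift with audience activity. Inputs are normalised to 0-1.

def visual_params(motion_level: float, audio_level: float) -> dict:
    """Map normalised sensor readings to rendering parameters."""
    return {
        "speed": 0.2 + 0.8 * motion_level,       # busier room -> faster motion
        "brightness": 0.4 + 0.6 * audio_level,   # louder room -> brighter output
        "palette": "warm" if audio_level > 0.5 else "cool",
    }
```

A real system would smooth these values over time so the visuals evolve rather than flicker with every reading.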

Video Credit: Emil Lanne

🎓 IMMERSIVE TRAINING & XR SIMULATIONS

8 - Virtual Fire Drills & Safety Simulation in Headsets

What’s new
VR drills are now simulating high-pressure environments like warehouse fires or evacuations – complete with decision-making tasks, sensory prompts, and safety training metrics.

Why it matters
These experiences deliver repeatable, scalable training without real-world risk – especially valuable in compliance-heavy industries.

Where this is useful

  • Logistics, shipping or aviation
  • Factory training and onboarding
  • Energy and defence environments

Our opinion
We’re witnessing steady demand for practical VR that delivers operational benefits. These applications serve as essential tools for reducing training costs, standardising processes, and improving knowledge retention across technical teams. Whilst the Quest and Vision Pro headsets offer substantial entertainment value in gaming and fun activations, we consistently see value in these types of enterprise applications for our client needs.
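The "safety training metrics" side is often just structured logging of decisions and timings per trainee. A minimal sketch, with field names that are illustrative rather than from any specific platform:

```python
# Sketch: per-trainee metrics a VR drill might log and summarise.

from dataclasses import dataclass, field

@dataclass
class DrillSession:
    # Each decision is recorded as (correct, seconds taken).
    decisions: list = field(default_factory=list)

    def record(self, correct: bool, seconds: float):
        self.decisions.append((correct, seconds))

    def summary(self) -> dict:
        total = len(self.decisions)
        correct = sum(1 for ok, _ in self.decisions if ok)
        return {
            "accuracy": correct / total if total else 0.0,
            "total_time": sum(t for _, t in self.decisions),
        }

s = DrillSession()
s.record(True, 4.2)   # e.g. chose the correct exit route
s.record(False, 9.0)  # e.g. missed the alarm point
print(s.summary())
```

Aggregating these summaries across cohorts is what makes the training repeatable and auditable in compliance-heavy settings.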

Video Credit: Zac Duff

🧰 INDUSTRY COMMENTARY & TRENDS

9 - Meta Opens Up Mixed Reality API Access for Quest

What’s new
Meta has released developer access to Scene Understanding APIs for Quest 3 and 3S, providing developers with tools to better anchor and persist virtual content within physical spaces. This update enables applications to map rooms more effectively and remember object positions between sessions.

Why it matters
Persistent spatial anchoring has long been the missing element for creating truly embedded mixed reality experiences. This update addresses that fundamental limitation.

Our opinion
We’ve anticipated stable scene anchoring on Quest for some time – this finally unlocks spatial storytelling that remembers, persists and adapts. We’re already modifying an upcoming installation to incorporate these capabilities, particularly focusing on content that can react to and integrate with a visitor’s home environment rather than simply floating in space.
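Persistence itself is conceptually simple: anchor IDs and poses are saved between sessions so content reappears where it was left. Meta's actual Scene and Spatial Anchor APIs are accessed through its native SDKs; the Python below is only a platform-agnostic sketch of the save/restore pattern, with illustrative file paths and data:

```python
# Sketch: session-persistent spatial anchors - save IDs and poses to disk
# so virtual content returns to the same physical spot next run.

import json, os, tempfile, uuid

def save_anchors(anchors: dict, path: str):
    """Persist anchor IDs and poses between sessions."""
    with open(path, "w") as f:
        json.dump(anchors, f)

def load_anchors(path: str) -> dict:
    """Restore previously placed anchors; empty dict on first run."""
    try:
        with open(path) as f:
            return json.load(f)
    except FileNotFoundError:
        return {}

path = os.path.join(tempfile.gettempdir(), "anchors_demo.json")
anchors = {str(uuid.uuid4()): {"label": "poster", "pose": [0.0, 1.5, -2.0]}}
save_anchors(anchors, path)
restored = load_anchors(path)  # next session: content returns to the same spot
```

The real APIs add the hard part – re-localising those IDs against the room map – but the application-side contract is this save/restore loop.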

10 - Humane AI Pin – A Necessary Misstep?

What’s new
TechCrunch evaluated the Humane AI Pin – a wearable AI assistant that’s faced criticism for short battery life, inconsistent usability, and unclear purpose despite its innovative concept and significant investment.

Why it matters
The product highlights the challenges of creating truly useful wearable AI that goes beyond simply shrinking existing interfaces.

Our opinion
Wearable technology requires solving real problems in novel ways. Humane’s early challenges demonstrate that functionality and user experience must align perfectly. For brand applications, clarity of purpose remains paramount. While we admire the ambition, our installations emphasise focused utility over technological novelty – ensuring visitors immediately understand the value proposition.

11 - Meta Reality Labs Layoffs – Supernatural and XR Restructuring

What’s new
Meta has implemented significant staff reductions within Reality Labs, primarily affecting its fitness app Supernatural, whilst maintaining substantial investment in core VR/AR platform development. This represents a strategic refocusing rather than retreat from the XR market.

Why it matters
Even with strong financial backing, the XR industry is undergoing necessary consolidation around proven revenue models.

Our opinion
This restructuring reinforces our approach to designing flexible, cross-platform XR experiences that require minimal deployment resources – ensuring adaptability regardless of ecosystem evolution. We’re advising clients to focus on content that translates across multiple delivery methods rather than becoming overly dependent on specific hardware platforms or app environments. This modular approach ensures longer-term viability even as individual platforms shift.

12 - Smart Home Robotics – From Floor Cleaning to Cat Entertainment

What’s new
To wrap up our insights with a fun one: British researchers at the University of Hertfordshire have proposed 100 innovative applications for domestic robot vacuums beyond floor cleaning – including pet entertainment, plant watering, reminder systems, and ambient environment control through programmable movement patterns.

Why it matters
The research demonstrates how existing technology can be repurposed through creative thinking rather than requiring new hardware development.

Our opinion
This research brilliantly captures effective experiential design principles: inventive, resourceful, and playful. Creative repurposing often delivers better results than constant hardware iteration.

This month’s XR and AI developments reveal a market evolving in three clear directions: wearables delivering practical everyday utility through voice, translation and navigation; experiential design transforming from static presentations to responsive, adaptive systems; and foundational improvements in technologies like GPT-4.1 and Quest APIs enabling capabilities that were previously impossible.

We’re creating experiences that genuinely respond to context, emotion, movement and inquiry – if that approach resonates with your vision, let’s collaborate.

MAY 2025 TECH REPORT: GOOGLE I/O, AI FILMMAKING & SMART WEARABLES
https://solarflarestudio.co.uk/may-2025-tech-report-google-i-o-ai-filmmaking-smart-wearables/
Wed, 04 Jun 2025 – Solarflare Studio


May delivered a wave of breakthrough tools that blur the lines between AI creation, physical interaction, and consumer reality. Google’s I/O showcase dominated headlines with integrated AI workflows, while smart glasses evolved from prototype to practical. Meanwhile, immersive installations proved that spatial storytelling doesn’t always need complex tech – sometimes it’s about clever perspective and emotional design.

Here’s what caught our attention this month – and how we see it reshaping interactive experiences.

🤖 GOOGLE I/O 2025 SHOWCASE

Google’s annual developer conference dominated May with a comprehensive AI and XR showcase. The event revealed integrated workflows, practical wearables, and commerce-ready tools that signal AI moving from experimental to essential.

1 - Google Flow – Complete AI Filmmaking Platform

What’s new?

Google unveiled Flow at I/O 2025 – a unified platform combining Veo (video), Imagen (image), and Gemini (text) into one filmmaking interface. Creators can explore over 600 films and their exact prompts via Flow TV, making AI video creation transparent and accessible.

Why it matters

This is the first serious platform designed to make complex multi-modal AI creation accessible to brand teams and creative professionals. It removes technical barriers and makes high-quality video prototyping available to anyone who can write a prompt.

Where it’s useful

  • Client presentations: Create compelling mood films and concept videos in minutes, not days
  • Campaign ideation: Test visual styles and narratives before committing to production budgets
  • Social content: Generate branded short-form videos tailored to platform requirements
  • Stakeholder communication: Show rather than tell when presenting experiential concepts to decision-makers

 

Our opinion

We’re excited by Flow’s transparency – seeing prompts alongside outputs is rare in AI tooling and makes it easier for teams to learn and iterate. This will reshape the industry by democratising high-end creative capabilities, allowing brands to prototype and iterate concepts at unprecedented speed. We see this sparking a new era of creative experimentation where the barrier between idea and execution almost disappears.

2 - Google Veo 3 – Physics-Aware Video Generation

What’s new?

DeepMind’s Veo 3 generates video with realistic physics, sound effects, and dialogue from a single prompt. Key breakthrough: hyperreal water physics and multi-modal synchronisation that keeps visuals, audio, and movement aligned.

Why it matters

This elevates AI video from “interesting toy” to “usable tool.” The physics engine and audio sync mean prototypes can feel cinematic and believable – crucial for pre-sale presentations and creative boards.

Where it’s useful

  • Product demonstration videos with realistic physics
  • Atmospheric brand films for mood and tone-setting
  • Quick environmental storytelling for experiential concepts
  • Social content with believable motion and sound design

 

Our opinion

This is the most impressive text-to-video we’ve seen, and it is clearly only the start. We’re excited to be working with it across new mediums of creative storytelling. It won’t remove the need for craft – you still need creative skill alongside the tool – but it’s a powerful new addition to the arsenal. Based on the pace of AI progress, this is the worst the technology will ever be, which is still incredible.

3 - Google Android XR Glasses with Gemini AI

What’s new?

Google teased smart glasses with deep Gemini AI integration ahead of I/O 2025. Expected features include real-time translation, environmental understanding, and hands-free contextual assistance.

Why it matters

This technology moves beyond prototype stage to deliver real utility that consumers will use daily. For brands, it opens opportunities for persistent ambient experiences that layer helpful information into physical spaces without requiring app downloads or QR codes.

Where it’s useful

  • Retail flagship stores: Provide product information and personalised recommendations through subtle visual overlays
  • Live events and festivals: Offer real-time navigation, artist information, and contextual content without disrupting the experience
  • Tourism and hospitality: Enable multilingual support and location-based storytelling in hotels, museums, and attractions
  • Training and onboarding: Deliver hands-free, contextual guidance in warehouse, retail, or industrial environments

 

Our opinion

All of this is heading in the direction of wearable technology, which we’ve been highlighting for months. We believe a shift is happening where this will potentially become the norm – ambient, always-on information that feels natural rather than intrusive.

Credit: Android XR

4 - Google AI Try-On Integration

What’s new?

Google’s AI-powered “Try-On” feature integrated directly into Search allows users to visualise clothing across different poses and body types using generative modelling and personalisation.

Why it matters

This integration fundamentally changes customer expectations for online shopping. When try-on experiences become as standard as product photos, brands that don’t offer visual interaction will feel outdated. It also reduces purchase hesitation and return rates – solving real business problems.

Where it’s useful

  • E-commerce optimisation: Reduce return rates and increase conversion by letting customers see products on themselves before buying
  • Product launch campaigns: Create interactive reveals where audiences can immediately try new collections
  • Retail store integration: Bridge online and offline with digital mirrors that access full inventory
  • Social commerce: Enable try-on experiences within social media shopping flows for impulse purchases

 

Our opinion

The fact that this is native to Google Search changes everything. No app downloads, no filters – just utility where people already shop. Retail brands should explore integrating their catalogues immediately and test conversion lift versus standard product pages.

👓 WEARABLE TECH BEYOND GOOGLE

5 - Reebok x Innovative Eyewear – AI Sport Glasses

What’s new?

Reebok launched AI-integrated smart eyewear designed specifically for sports performance, featuring high-fidelity speakers, outdoor-tuned amplifiers, and real-time coaching capabilities.

Why it matters

Fitness wearables are evolving into performance tools. This opens new terrain for real-time coaching, biometric analysis, and ambient content delivery during workouts or sport-related brand experiences.

Where it’s useful

  • Immersive fitness experiences and guided workouts
  • Real-time performance coaching in sports activations
  • Audio-guided brand experiences in outdoor settings
  • Hands-free content delivery during physical activities

 

Our opinion

We love the concept of what Reebok are doing and why they’re doing it. We certainly have experience working in fitness, athletics, and sports, and we’re looking forward to continuing to explore wearable technology and eyewear that provides data streams. This biometric and performance data can be fed into interactive, immersive experiences – creating personalised moments that respond to your actual physical state and performance.


6 - AI-Enhanced First-Person Capture

What’s new?

FOOH (Fake Out Of Home) experiences typically require heavy 3D rendering, animation, and compositing into scenes. However, new workflows now combine first-person footage from Meta Ray-Ban glasses with AI-powered generative editing tools. This allows creators to capture authentic moments passively, then enhance with AI post-production in a relatively straightforward process.

Why it matters

This “record now, edit later” model streamlines content production for live environments. It combines authentic capture with AI-driven scene manipulation for agile creative workflows.

Where it’s useful

  • Authentic UGC-style content with post-production polish
  • Live event documentation with cinematic enhancement
  • Travel and lifestyle brand content creation
  • Ambassador programs with professional-grade output

 

Our opinion

This is another angle showing why AI is fast-tracking creative capabilities. We’re excited to continue working with this approach, especially as we’ve had experience working in FOOH before. It’s a game-changer for content creation workflows.

Credit: Runway, Cristóbal Valenzuela

🌊 IMMERSIVE SPACES & VISUAL STORYTELLING

7 - Height-Based Immersive Installations

What’s new?

Viral installations using transparent surfaces and projected visuals create powerful depth illusions. Visitors stand on raised platforms while digital content below induces emotional intensity through clever spatial design and perspective tricks.

Why it matters

This proves that powerful emotional reactions don’t require complex interactive systems. Simple spatial design combined with digital motion and psychological perception can create unforgettable moments.

Where it’s useful

  • Museum exhibits and cultural installations
  • Retail showrooms with memorable brand moments
  • Pop-up experiences focused on emotional impact
  • Event spaces where visitors become part of the story

 

Our opinion

Sometimes you just need really cool-looking content for installations, and this really delivers on that. The user’s emotional response is triggered not by jump scares but by depth illusion and height. It’s a relatively straightforward experience that works at both small and large scales – you can imagine this concept adapted for intimate gallery spaces or massive event installations.

Video Credit – bbanzzak_mom

🧱 INTERACTIVE SPACES & GESTURE TRACKING

8 - Real-Time Light and Sound Installations

What’s new?

Perfect example of low-complexity, high-impact content designed for installations and experiences. This showcases the collaboration between lasers, light, sound, and physical objects with LED strips embedded in specific objects – where everything reacts and lights up in harmony.

Why it matters

This is a perfect use case for real-time, user-driven visuals that relate to audio, light, and human haptic feedback. The technical overhead is minimal, but the output creates dynamic, responsive environments.

Where it’s useful

  • Small-scale events requiring tactile, interactive moments
  • Gallery installations with ambient, responsive lighting
  • Brand experiences that respond to presence and movement
  • Event spaces where physical interaction drives visual storytelling

 

Our opinion

We see these experiences bringing together multiple elements – installation and fabrication, lighting, LED strips, integrated tech, motion, audio input – all combining to create real-time presence. For small-scale events that need to feel tactile and fun, it shows how simple tech can create profound interactive moments when thoughtfully orchestrated.
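The reactive loop behind installations like this can be tiny: one function from input level to per-LED output each frame. A sketch, assuming a normalised audio level from any envelope follower:

```python
# Sketch: drive LED strip brightness from an audio level - the core loop
# behind a reactive light-and-sound installation.

def led_frame(num_leds: int, audio_level: float) -> list[int]:
    """Per-LED brightness (0-255) for one frame; level is normalised 0-1."""
    lit = round(audio_level * num_leds)  # louder -> more of the strip lights up
    return [255 if i < lit else 0 for i in range(num_leds)]

print(led_frame(8, 0.5))  # half the strip lit
```

In hardware this frame would be pushed to the strip over SPI or a protocol like WS2812, with smoothing between frames to avoid strobing.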

Video Credit – Justin Kittell & James Sartor

9 - HYPERVSN Hologram Displays – Mobile 3D DOOH

What’s new?

HYPERVSN launched the first mobile 3D hologram truck, displaying moving holographic content while driving through major cities – taking high-precision holographic displays to new locations, scenes, and experiences.

Why it matters

This is just another step forward in how you can utilise high-precision displays and holographic technology in different environments. It creates fun, interesting, eye-catching visuals that bring immersive media to high-traffic areas without requiring fixed infrastructure.

Where it’s useful

  • Product launches requiring maximum attention
  • Campaign stunts in areas without screen real estate
  • Event marketing with portable, striking visuals
  • Seasonal brand roadshows and touring activations

 

Our opinion

We’ve worked with HYPERVSN on many occasions, and this is just another medium – another way of getting your message, brand, or experience out there in a striking way. It’s portable spectacle that demonstrates the evolution of display technology.

🛍 COMMERCE GETS VISUAL

10 - Photoreal Creator Worlds in Gaming

What’s new?

Gaming platform creators are building photoreal environments that rival traditional game studios, shifting from simple aesthetics to detailed, immersive worlds while maintaining rapid iteration and creator-first economics.

Why it matters

The “YouTube-ification” of gaming is becoming reality. UGC is surpassing AAA production cycles for some content types, meaning brand world-building might shift from major studio production into creator-led ecosystems.

Where it’s useful

  • Youth-focused brand worlds and virtual experiences
  • Product placement in high-engagement gaming environments
  • Community-driven brand storytelling and user-generated content
  • Virtual events and concerts within gaming platforms

 

Our opinion

There’s so much you can do on Roblox, and if you have a specific idea of what type of content you want to create, this platform shows you can reduce barriers and be as creative as you possibly want with the tools – whilst having massive traction, reach, and user engagement throughout. It’s becoming a serious creative platform, not just a game.

Video Credit: Stephen Dypiangco

🧰 INDUSTRY COMMENTARY & TRENDS

11 - 3D Digitisation Technology for Immersive Experiences

What’s new?

Advanced 3D scanning technology creates incredibly detailed, fully explorable 3D models with spatial annotations and real parallax effects that run directly in web browsers.

Why it matters

This technology creates incredibly detailed experiences that run in the browser with low barrier to entry, but you can still get up close and personal with all the information and detail. It’s photoreal fidelity without requiring downloads or special software.

Where it’s useful

  • Museums and galleries for interactive exhibitions
  • Sporting stadiums and behind-the-scenes experiences
  • Event venues and locker room tours
  • Heritage sites and cultural storytelling
  • Product showcases requiring detailed inspection

 

Our opinion

Picture this across installations, museums, galleries, but also sporting stadiums, events, locker rooms, behind-the-scenes experiences – there’s so much you can do. The quality is impressive, and spatial annotations add real editorial depth for any environment you want to digitise and share.

Credit: Arrival.Space

12 - Interactive Experiences for the Web – Pushing Creative Boundaries

What’s new?

Advanced web technologies are enabling breakthrough interactive experiences that create stunning organic visuals driven entirely by code and creative algorithms, with no heavy asset files required.

Why it matters

The latest web technology capabilities allow for real-time creative experiences without asset dependency. This means truly responsive, beautiful visuals using just creative code and the latest browser capabilities – making powerful interactive experiences accessible to any device with a web browser.

Where it’s useful

  • Brand microsites with generative, ambient visuals
  • Web-based installations and interactive galleries
  • Lightweight immersive experiences on mobile
  • Live, responsive websites that adapt to user interaction

 

Our opinion

This shows how far web technology has advanced – you can now create lightweight, fully interactive, and beautiful experiences that work across all devices. It opens up live, generative websites for brands demanding high visual impact with minimal technical overhead.
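The "no heavy assets" point is the key one: every pixel is computed from a formula, much as a fragment shader does on the web. A toy Python analogue of that per-pixel idea (the specific formula is illustrative):

```python
# Sketch: asset-free generative visuals - brightness computed purely from
# code, analogous to a per-pixel web shader.

import math

def pixel(x: float, y: float, t: float) -> float:
    """Brightness (0-1) at normalised position (x, y) and time t."""
    v = math.sin(10 * x + t) * math.cos(10 * y - t)
    return (v + 1) / 2  # remap [-1, 1] -> [0, 1]
```

Evaluated across the canvas each frame, a function like this produces moving organic patterns with no image files at all; on the web the same idea runs on the GPU via WebGL or WebGPU.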

Credit: Niklas Niehus

May 2025 saw significant advancements in AI and wearable technology.

These technologies are becoming increasingly practical and impactful, transforming how we create, interact, and experience the world. The lines between physical and digital are blurring, and brands are finding new ways to connect with audiences through personalised and immersive experiences.

This progress signals a future where innovation is accelerating, and the possibilities for creative expression and engagement are expanding rapidly.


The post MAY 2025 TECH REPORT: GOOGLE I/O, AI FILMMAKING & SMART WEARABLES appeared first on Solarflare Studio.

MARCH 2025 – EMERGING TECH REPORT AI, XR & INTERACTIVE TECHNOLOGY – LATEST DEVELOPMENTS https://solarflarestudio.co.uk/march-2025-emerging-tech-report-ai-xr-interactive-technology-latest-developments/ Tue, 01 Apr 2025 12:11:07 +0000

March was huge for innovation—from GDC 2025 headlines to major breakthroughs in generative AI, XR design tools, and next-gen AR hardware. We could easily dedicate this entire report to the progress in AI alone, but we’ve focused on the tools that genuinely shift how brands and creative teams can prototype, tell stories, and engage audiences.

It’s no small task narrowing this down from over 100 updates, but here’s our take on the innovations that matter most this month—with our thoughts on why they’re relevant, what we like, and where we see immediate application.

🤖 AI-POWERED CREATIVE TOOLS

1 - Vibe Coding – Natural Language to Functional Assets

What’s New?
“Vibe coding” is the emerging term for tools that turn plain language into real, usable creative outputs—code, 3D models, UI elements and more.

Why It Matters?
It’s like having a developer or designer working with you in real time. This helps teams move faster in early stages of concepting, and opens up production to people who might not have technical backgrounds.

Where This is Useful:

  • Internal brainstorms or client co-creation

  • Quick mockups or visual ideas

  • Tools for brands to experiment without dev teams

🧱 TEXT-TO-3D & WORLD GENERATION

2 - Generate 3D Assets from Images + Prompts

What’s New?
The Tripo3D tool lets you upload an image, type a short description, and get a 3D object ready to use in Blender—no modelling experience required.

Why It Matters?
Great for quick experimentation. Whether you’re designing a space, building a virtual scene, or planning a product showcase, this takes hours off the process.

Where This is Useful:

  • Concept visuals for events and installs

  • Prototyping XR or game scenes

  • Quick visualisation for client sign-off

3 - Meta Reality Labs – Turn Any Photo Into a Walkable 3D Space

What’s New?
Meta has developed a way to turn a flat image into a full 3D scene you can move through, using AI to fill in missing details.

Why It Matters?
This unlocks quick previews of what a pop-up, space, or environment could feel like—without weeks of build time. It’s a huge time-saver for creative planning.

Where This is Useful:

  • Pre-vis for XR scenes

  • Immersive retail layouts or spatial mockups

  • Turning historical or concept imagery into walkthroughs

🧠 CONTEXTUAL & SPATIAL AI

4 - Spatial LM (Hugging Face) – An AI That Understands Space

What’s New?
This AI model can understand where objects are, how spaces are laid out, and what directions things face. Think of it like a spatially aware assistant.

Why It Matters?
Perfect for XR apps, spatial interfaces, or smart assistants. This lets AI actually “understand” the room—so it can guide users or respond to layout changes in real time.

Where This is Useful:

  • AR navigation or design previews

  • Interactive assistants in physical spaces

  • Smarter in-game characters and guides
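SpatialLM's actual interface isn't covered here, but the core capability, reasoning over object layout, can be pictured with a toy example (the object names and coordinates below are invented for illustration):

```python
def left_of(objects, name):
    """Return the objects whose centre x-coordinate is smaller than the
    named object's; `objects` maps name -> (x, y) centre in metres."""
    x_ref, _ = objects[name]
    return sorted(other for other, (x, _) in objects.items() if x < x_ref)

# A hypothetical room layout -- real spatial models infer this from sensors.
room = {"sofa": (1.0, 2.0), "lamp": (0.2, 2.5), "table": (2.4, 1.0)}
```

Once an assistant holds a structured map like this, queries such as "guide me to the lamp" or "what's blocking the left wall?" become simple lookups rather than guesswork.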

🥽 XR & SPATIAL COMPUTING

5 - Samsung XR Headset (Project Moohan) – Android-Based, Open Platform

What’s New?
Samsung’s new XR headset runs on Android, unlike Apple’s closed system—making it more flexible for custom content.

Why It Matters?
This opens the door to more affordable, brand-friendly XR builds without being locked into Apple’s ecosystem. Ideal for multi-platform activations and custom apps.

Where This is Useful:

  • XR content with backend flexibility

  • Event activations needing unique deployment

  • Custom enterprise applications

Video by Ben Geskin

6 - Custom VR Immersion Rigs – Physical Add-ons for VR

What’s New?
We’re seeing home-built rigs that simulate real physical motion—like skydiving or movement-based haptics—to work alongside consumer headsets like Meta Quest.

Why It Matters?
These setups offer inspiration for adding physical interaction to brand experiences—turning a standard VR moment into something unforgettable.

Where This is Useful:

  • Location-based brand experiences

  • Training simulations with physical realism

  • VR arcades or festival installations

Video by SkyAmirV

7 - Snap Spectacles (5th Gen) – Location-Based AR + Hand Tracking

What’s New?
The latest Spectacles update includes GPS-powered AR, better hand gesture controls, and interactive features like built-in scoring or AR keyboards.

Why It Matters?
This pushes AR wearables from novelty to real-world use. Brands can now create hands-free, location-specific experiences that guide users or reward participation—no phone screen needed.

Where This is Useful:

  • On-site AR treasure hunts or tours

  • Brand gamification with leaderboard systems

  • Immersive AR without friction at events

🎮 INTERACTIVE EXPERIENCES

8 - Roblox Egg Hunt – Multi-World Digital Quest

What’s New?
Roblox launched a massive “egg hunt” community event with a $1 million prize pool, themed around Ready Player One-style virtual quests across multiple game worlds.

Why It Matters?
This event showcases the evolving sophistication of platform-wide virtual events and the power of shared goals in driving engagement. With its substantial prize pool and cross-world gameplay, Roblox demonstrates how virtual platforms can create cultural moments that rival physical events in scale and excitement. For brands, this approach offers a blueprint for creating meaningful digital activations that leverage existing online communities rather than building destinations from scratch.

The cross-world nature of the event is particularly noteworthy—by integrating challenges across different experiences, Roblox created a cohesive narrative that encouraged exploration while maintaining consistent engagement mechanics. This strategy could translate well to multi-location retail activations or cross-brand partnership campaigns.

Where This is Useful:

  • Virtual event design and cross-platform brand activations
  • Gamified loyalty programs with compelling reward structures
  • Community building through shared challenges and goals

9 - Shadow Art by Joon Moon – Shadows as Interfaces

What’s New?
Visitors’ shadows interact with digital content in a projection-mapped installation—no screens, no learning curve.

Why It Matters?
It’s instantly engaging, especially in public or family-friendly spaces. It’s one of the most natural interaction types we’ve seen—intuitive and joyful.

Where This is Useful:

  • Public or cultural activations

  • Museums and galleries

  • Interactive storytelling spaces

Video Credit: Joon Moon

🎭 IMMERSIVE STORYTELLING & WEARABLE TECH

10 - Biotron – Turn Objects into Proximity Sensors

What’s New?
Biotron is a new device that turns any conductive object into an interactive sensor. It uses technology similar to touch screens but far more sensitive; in fact, you don’t even need to touch the objects for them to detect you.

Combining dynamic lighting, reactive sound design, and physical structures into one cohesive system, the experience adapts to users in real time, responding to proximity or movement to create a layered, immersive environment that feels alive.

Why It Matters?

Using the physical environment as the interactive interface expands the possibilities when creating engaging experiences. More importantly, it puts interaction at the centre, letting brands tell stories through proximity, touch and ambience, not screens or menus.

Where This is Useful:

  • Adding interactivity to surprising objects

  • Works as a plug-and-play MIDI device

  • Interactive spaces that blend digital and physical elements
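One practical detail with proximity sensing: readings from conductive objects are noisy, so installations typically debounce them. A common pattern (our sketch, not Biotron's own firmware) is threshold hysteresis: switch on at a high level and only switch off again below a lower one, so the interaction doesn't flicker.

```python
class ProximityTrigger:
    """Hysteresis debouncing for a noisy 0-1 proximity reading: activate
    above `on_level`, deactivate only once it drops below `off_level`."""
    def __init__(self, on_level=0.7, off_level=0.4):
        self.on_level, self.off_level = on_level, off_level
        self.active = False

    def update(self, reading):
        if not self.active and reading >= self.on_level:
            self.active = True
        elif self.active and reading <= self.off_level:
            self.active = False
        return self.active

trigger = ProximityTrigger()
states = [trigger.update(r) for r in (0.2, 0.8, 0.6, 0.5, 0.3, 0.9)]
```

Note how the 0.6 and 0.5 readings keep the trigger on even though they are below the 0.7 activation level: that gap between the two thresholds is what makes the interaction feel stable.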

11 - ElevenLabs x Dalí Museum – Voice AI in Culture

What’s New?
Visitors to the Dalí Museum can talk into a surreal “lobster phone” and hear replies in Dalí’s voice, generated by AI.

Why It Matters?
It’s weird, respectful, and personal—a great example of voice AI adding atmosphere. The same technique could easily power brand mascots or founder personas in spaces.

Where This is Useful:

  • Voice-driven museum or brand guides

  • Historical or character storytelling

  • Personalised brand moments at scale

Curious about what’s possible? From interactive experiences to AI-driven content creation, new technology is opening up fresh opportunities for engagement. Whether you’re looking to experiment or build something groundbreaking, we’d love to explore ideas with you.

The post MARCH 2025 – EMERGING TECH REPORT AI, XR & INTERACTIVE TECHNOLOGY – LATEST DEVELOPMENTS appeared first on Solarflare Studio.

FEBRUARY 2025 – EMERGING TECH REPORT AI, XR & INTERACTIVE TECHNOLOGY – LATEST DEVELOPMENTS https://solarflarestudio.co.uk/february-2025-emerging-tech-ai-xr-report/ Sun, 02 Mar 2025 16:29:27 +0000

Emerging innovations in AI tracking and face-swapping are transforming interactive content. This month, real-time motion tracking gets a major boost with TouchDesigner’s new OpenPose plugin, while DeepFaceLab’s latest update makes AI face-swaps smoother and more realistic than ever.

Each month, we highlight the latest technological advancements alongside creative experiments that inspire us. This includes new AI tools, XR applications, and interactive experiences that can drive brand activations and immersive events. Our goal is to showcase innovations that are relevant for brands looking to push creative boundaries.

Here’s what’s new, how it’s shaping interactive experiences, and where these tools can be applied.

🌀 Immersive Tech (AR, VR & XR)

1 - NBA Tabletop Mode for Vision Pro – AR Basketball Viewing

What’s new?
NBA League Pass has introduced a feature that allows live basketball games to be viewed on a tabletop in AR using Apple Vision Pro, offering a more interactive way to engage with sports content.

Why it matters?
Sports broadcasts are evolving, and this is a perfect example of how AR can enhance engagement. Real-time stats, alternate camera angles, and virtual overlays give fans control over how they experience the game. We see this as a step towards more interactive, data-rich sports viewing that could easily extend to brand activations and sponsorships.

Where this is useful:

  • AR-enhanced sports broadcasting

  • Fan engagement with customisable viewing experiences

  • Branded digital activations for live events

Video Credit: Todd Moyer

2 - Capturing Locations for AR Exploration

What’s new?
A recent project showcased how real-world locations can be scanned and brought into AR, allowing users to explore detailed 3D scenes in augmented space.

Why it matters?
We see this as a powerful way to bring physical spaces into digital activations. Capturing locations in 3D and integrating them into AR offers endless possibilities for virtual tourism, branded experiences, and immersive storytelling. We’ve explored similar techniques in our F1 project, where we scanned the Baku pit garage and created a portal-based AR experience that allowed users to step inside and interact with the environment.

Applications:

  • AR-based location exploration and storytelling

  • Virtual tourism and event previews

  • Branded activations using real-world spaces

Video Credit: Ian Curtis

3 - Vuforia Engine 11 – Advanced AR Features for Enhanced Brand Experiences

What’s new? Vuforia Engine 11 provides improved object tracking, spatial navigation, and cloud-based AR for dependable, scalable solutions.

Why it matters? We see this as a gateway to more reliable interactive experiences. Spatial navigation supports features like wayfinding and scavenger hunts, blending practicality with playful AR engagement.

Applications:

  • Robust AR apps for product demos

  • Interactive retail displays with precise tracking

  • Scavenger hunts and guided tours


 

Video Credit: Vuforia

🎨 Generative AI & Content Creation

4 - Pika AI – Two Major Releases in February 2025

What’s new? Pika AI has had an exciting month with two major updates:

  1. Pika 2.2 introduces Pikaframes, a keyframe transition system enabling smooth video transitions from 1 to 10 seconds. It also supports up to 10 seconds of 1080p resolution video, offering more creative control over AI-generated sequences.

  2. Real-time object and people insertion allows users to seamlessly add new elements into videos without traditional CGI, making AI-driven content creation more dynamic and accessible.


Why it matters?
These updates take AI-generated video content to the next level. Pikaframes provides more cinematic and professional storytelling, while the real-time object insertion feature unlocks new possibilities for interactive marketing, VFX, and personalised branding. This makes high-quality video production more efficient and accessible for brands.

Where this is useful:

  • AI-assisted video production with greater control

  • High-quality social media and brand storytelling

  • Personalised and interactive marketing campaigns
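Pika's internals aren't public, but the idea behind a keyframe transition can be pictured as interpolating from a start frame to an end frame across the clip's duration. A bare-bones linear crossfade, for illustration only:

```python
def crossfade(start_frame, end_frame, duration_s, fps=24):
    """Interpolate pixel values linearly from start to end over the clip.
    Frames are flat lists of brightness values in [0, 1]."""
    n = max(2, int(duration_s * fps))
    frames = []
    for i in range(n):
        t = i / (n - 1)  # 0.0 on the first frame, 1.0 on the last
        frames.append([a + (b - a) * t for a, b in zip(start_frame, end_frame)])
    return frames

clip = crossfade([0.0, 1.0], [1.0, 0.0], duration_s=1.0, fps=5)
```

A generative model replaces the straight-line blend with plausible intermediate imagery, but the creative contract is the same: you pin down the endpoints, the system fills the in-betweens.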

 

Video: Pika Real-time Object Insertion – our experiments

Video: PikaFrames

5 - FLORA – AI-Powered Storytelling & Shot Planning

What’s new? FLORA is a new node-based AI canvas designed for filmmakers. It doesn’t just generate visuals—it analyses stories and suggests shots based on structure and intent.

Why it matters? By breaking down story beats, suggesting shot ideas, and improving iteration speed, FLORA helps filmmakers and brands create purposeful content while retaining creative control.

Where this is useful:

  • AI-assisted storyboarding & shot planning
  • Faster iteration for film & branded content
  • Structured AI-driven narrative creation

6 - Microsoft Muse – AI-Generated Gameplay from a Single Image

What’s new? Microsoft has introduced Muse, a groundbreaking AI model capable of generating entire gameplay sequences from a single image, leveraging real multiplayer game data to build immersive, dynamic experiences.

Why it matters? We see this as a huge step forward in AI-assisted game creation. Imagine brands launching interactive experiences with minimal development time—Muse makes that possible. Whether for promotional gaming experiences, branded storytelling, or interactive retail activations, this technology opens new doors for engagement.

Where this is useful:

  • AI-assisted game prototyping for marketing campaigns

  • Interactive brand storytelling through gaming

  • Rapid content creation for virtual experiences

Learn more about Muse

⚙ Immersive Tools & Real-Time Interaction

7 - TouchDesigner MediaPipe Plugin – Real-Time AI Face & Pose Tracking

What’s new? The latest MediaPipe plugin for TouchDesigner introduces OpenPose rendering, supporting real-time face and pose tracking for Stream Diffusion.

Why it matters? This is a great addition to our tech stack for real-time visual experiences. Whether it’s live events, performances, or interactive installations, brands can now create AI-powered visuals that react instantly to human movement, adding a new layer of engagement.

Where this is useful:

  • Brand activations with interactive visuals
  • AI-generated visuals for live events
  • Real-time reactive content for immersive exhibitions
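One practical detail when driving visuals from pose tracking: raw per-frame keypoints jitter, so they are usually smoothed before reaching the render layer. A minimal exponential-moving-average filter (our illustration, not part of the plugin):

```python
def smooth_keypoints(frames, alpha=0.3):
    """Exponential moving average over (x, y) keypoints per frame;
    lower alpha means steadier but laggier visuals."""
    smoothed, state = [], None
    for points in frames:
        if state is None:
            state = [tuple(p) for p in points]  # first frame seeds the filter
        else:
            state = [(sx + alpha * (x - sx), sy + alpha * (y - sy))
                     for (sx, sy), (x, y) in zip(state, points)]
        smoothed.append(state)
    return smoothed
```

Tuning `alpha` is the whole trade-off: responsive enough that the visuals feel connected to the performer, smooth enough that tracking noise doesn't read as glitch.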

Video Credit: @blankensmithing

8 - Wemade x NVIDIA ACE AI Boss – Dynamic AI-Powered NPCs

What’s new? Wemade unveils “Asterion,” an AI-powered MMORPG boss that evolves dynamically based on player interactions, using NVIDIA ACE technology.

Why it matters? We love how AI-driven NPCs can create more organic, unpredictable experiences in gaming. Beyond that, AI-powered interactions could also enhance brand storytelling and interactive retail, where digital assistants or virtual brand ambassadors adapt to user behaviour in real time.

Where this is useful:

  • AI-driven character behaviour in gaming & storytelling

  • Interactive digital brand ambassadors

  • Virtual assistants & real-time customer engagement

Video Credit: nvidia

🌐 Interactive Installations & Displays

9 - Weaving Light Tapestry – Laser & Projection Art Installation

What’s new? This installation explores how lasers and projection mapping can be used to weave light into complex, interactive compositions. The interplay of layered visuals and structured lighting techniques creates a rich, multidimensional experience that feels almost tangible, blending digital artistry with physical space.

Why it matters? We find this kind of approach exciting because it showcases how light can be used as a design element, not just for spectacle but as a medium for immersive storytelling. The fusion of digital and physical elements opens up fresh possibilities for retail displays, live events, and brand activations.

Where this is useful:

  • Large-scale brand activations
  • Retail store experiences
  • Interactive art installations

Video Credit: Todd Moyer

10 - Muxwave – Interactive LED Gateway

What’s new? The LANG UK stand features an eye-catching interactive LED gateway built using Muxwave technology. This installation blends large-scale visuals with cutting-edge LED display techniques to create a fully immersive experience.

Why it matters? We love how this pushes the boundaries of digital displays, offering brands a way to create striking, high-impact installations for events, retail spaces, and brand activations.

Where this is useful:

  • High-end retail displays
  • Experiential marketing activations
  • Large-scale event and trade show installations

Video Credit: LANG UK

🔧 Hardware & Tools

11 - Meta Aria Gen 2 – Smart Glasses for AI & XR Research

What’s new? Meta has introduced Aria Gen 2, the latest version of its experimental smart glasses designed for AI, XR, and robotics research. Featuring an advanced sensor suite—including an RGB camera, 6DOF SLAM, eye tracking, microphones, and biometric sensors—these glasses push forward the possibilities of hands-free, context-aware computing.

Why it matters? We’ve already received briefs for glasses-based experiences, and it’s clear that 2025 will see further momentum in this space. Whether for AI-assisted interactions, immersive retail applications, or real-time data overlays, smart glasses will be a critical component of future brand activations.

Where this is useful:

  • AI-powered real-world overlays for retail & navigation

  • Hands-free data access for industrial & creative workflows

  • Next-gen XR experiences for events & brand activations

Learn more about Meta Aria

🧪 Research & Development

12 - Meta for Education – Bringing Quest to Schools and Universities

What’s new? Meta’s initiative integrates Quest VR headsets into educational settings, supplying device management and an array of immersive apps.

Why it matters? We find VR to be a versatile tool for training, workshops, and collaborative projects, offering hands-on learning and skill-building in a virtual setting.

Applications:

  • VR-based corporate training

  • Immersive lessons for schools

  • Virtual collaboration and workshops

Learn more about Meta for Education

13 - 3D Gaussian Splatting – AI-Driven City Simulation

What’s new? 3D Gaussian Splatting is an innovative method that allows AI to generate highly detailed and dynamic real-world environments using point-based rendering. By leveraging hierarchical Gaussian splatting, this technique makes it easy to reconstruct entire cityscapes with remarkable realism and efficiency. Combined with procedural tools like Houdini and AI models like NVIDIA Cosmos, it allows for incredibly fluid and interactive experiences.

Why it matters? We love how accessible and scalable this approach is. Instead of relying on complex 3D modelling techniques, Gaussian splatting offers a lightweight way to create realistic environments with minimal processing power. This makes it ideal for large-scale simulations, immersive brand activations, and even real-time driving experiences where users can navigate and interact with AI-generated cityscapes. The ability to pair this with driving simulations or training modules adds an extra layer of engagement and fun.

Where this is useful:

  • AI-generated cityscapes for virtual events and gaming

  • Real-time urban simulations for architecture and planning

  • Driving experiences and immersive brand activations
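The core of point-based Gaussian rendering can be sketched in 2D: each point contributes a soft Gaussian "blob" to the image rather than a hard-edged primitive, and blobs simply accumulate. A toy illustration; the real technique renders millions of anisotropic 3D Gaussians.

```python
import math

def splat(points, width, height, sigma=1.5):
    """Accumulate a 2D Gaussian per (x, y, weight) point into an image grid."""
    image = [[0.0] * width for _ in range(height)]
    for px, py, weight in points:
        for y in range(height):
            for x in range(width):
                d2 = (x - px) ** 2 + (y - py) ** 2
                image[y][x] += weight * math.exp(-d2 / (2 * sigma ** 2))
    return image

img = splat([(4.0, 4.0, 1.0)], width=9, height=9)
```

Because each splat is a cheap, smooth falloff rather than geometry, scenes reconstruct and render fluidly, which is why the technique scales to whole cityscapes.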

Video Credit: Janusch Patas

14 - Meshcapade MoCapade 1.0 – Best-in-Class Markerless Motion Capture

What’s new? Meshcapade has launched MoCapade 1.0, delivering best-in-class markerless motion capture. This system refines 3D motion extraction from a single video, representing a major leap in accuracy and usability.

Why it matters? This is a game-changer for motion-driven experiences. Whether for virtual production, digital fashion, or brand activations, removing the need for tracking suits makes high-quality motion capture more accessible and cost-effective for brands looking to create interactive experiences.

Where this is useful:

  • AI-assisted motion capture for brand activations

  • Virtual production workflows for advertising & film

  • AR/VR content creation for immersive campaigns


Video Credit: Meshcapade

15 - Niantic’s Scaniverse – Exploring 3D Gaussian Splatting on Meta Quest

What’s new? Scaniverse supports photorealistic, on-device 3D scanning, incorporating Gaussian splatting for real-time exploration.

Why it matters? It elevates virtual tours with highly detailed environments, enabling robust brand activations and e-commerce displays that feel nearly tangible.

Applications:

  • Real estate or tourism tours

  • Immersive product showcases

  • Educational and museum exhibits

Video Credit: Niantic

Curious about what’s possible? From interactive experiences to AI-driven content creation, new technology is opening up fresh opportunities for engagement. Whether you’re looking to experiment or build something groundbreaking, we’d love to explore ideas with you.

The post FEBRUARY 2025 – EMERGING TECH REPORT AI, XR & INTERACTIVE TECHNOLOGY – LATEST DEVELOPMENTS appeared first on Solarflare Studio.
