Metaverse as a Service: Building Virtual Business Platforms in 2023
I spent three months last year evaluating metaverse platforms for a client in the healthcare training space. We were deciding between building on Decentraland, rolling our own WebGL environment, or using a platform-as-a-service approach. What I learned: the metaverse market in 2023 is a collection of disconnected experiments, not a coherent platform ecosystem.
This is a technical breakdown of what “Metaverse as a Service” actually means, what businesses are building, and how to evaluate whether it’s right for your use case.
What MaaS Actually Means
The “as a Service” suffix gets attached to everything, but for the metaverse it maps to a real business model: companies building virtual world infrastructure that other businesses can customize and deploy without building from scratch.
The analogy to SaaS works in some ways: you’re using someone else’s infrastructure, customizing the interface and content, and paying per usage or subscription. The difference is that the “software” is a 3D environment that requires specialized development skills to customize.
The key players in this space in 2023:
- Spatial — virtual collaboration spaces, used by enterprises for remote teamwork
- Virbela — virtual campus and event platform, strong in education and conferences
- Engage — enterprise metaverse platform, used for training and corporate events
- Mona — virtual world creation platform, more creator-focused
- Decentraland / The Sandbox — decentralized worlds, more consumer/gaming oriented
Each targets different use cases. Spatial is collaboration. Engage is corporate training. Decentraland is speculative virtual real estate. Mixing them up leads to bad architecture decisions.
The Technical Architecture
A Metaverse-as-a-Service platform has these layers:
┌─────────────────────────────────────────────┐
│                Client Layer                 │
│  (WebGL/Three.js, VR headset app, mobile)   │
├─────────────────────────────────────────────┤
│          Virtual Environment Layer          │
│   (3D world rendering, physics, avatars)    │
├─────────────────────────────────────────────┤
│           Realtime Communication            │
│     (WebRTC, Photon, Liveblocks, Ably)      │
├─────────────────────────────────────────────┤
│             Application Backend             │
│    (User management, analytics, content)    │
├─────────────────────────────────────────────┤
│            Infrastructure Layer             │
│     (Cloud hosting, CDN, asset storage)     │
└─────────────────────────────────────────────┘
The WebGL Approach
If you’re building a web-based metaverse experience, Three.js is the foundation most teams use:
// Basic Three.js scene setup for a metaverse space
import * as THREE from 'three';
import { OrbitControls } from 'three/examples/jsm/controls/OrbitControls.js';
import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';
class VirtualSpace {
constructor(container) {
this.scene = new THREE.Scene();
this.camera = new THREE.PerspectiveCamera(
75,
window.innerWidth / window.innerHeight,
0.1,
1000
);
this.renderer = new THREE.WebGLRenderer({
antialias: true,
alpha: true
});
    this.renderer.setSize(window.innerWidth, window.innerHeight);
    this.renderer.outputColorSpace = THREE.SRGBColorSpace; // outputEncoding was deprecated in r152 and removed in r162
    this.renderer.toneMapping = THREE.ACESFilmicToneMapping;
    this.renderer.toneMappingExposure = 1.2;
    this.renderer.shadowMap.enabled = true; // required for the shadow flags set in loadEnvironment()
    container.appendChild(this.renderer.domElement);
// Lighting for realistic rendering
const ambientLight = new THREE.AmbientLight(0xffffff, 0.4);
this.scene.add(ambientLight);
    const directionalLight = new THREE.DirectionalLight(0xffffff, 0.8);
    directionalLight.position.set(5, 10, 7);
    directionalLight.castShadow = true; // the meshes' castShadow/receiveShadow flags need a shadow-casting light
    this.scene.add(directionalLight);
// Orbit controls for navigation
this.controls = new OrbitControls(this.camera, this.renderer.domElement);
this.controls.enableDamping = true;
this.controls.dampingFactor = 0.05;
this.camera.position.set(0, 2, 10);
this.loadEnvironment();
this.animate();
}
loadEnvironment() {
const loader = new GLTFLoader();
// Load pre-built environment or avatar space
loader.load(
'/assets/environments/conference-room.glb',
(gltf) => {
this.scene.add(gltf.scene);
// Optimize for rendering
gltf.scene.traverse((child) => {
if (child.isMesh) {
child.castShadow = true;
child.receiveShadow = true;
}
});
},
(progress) => {
console.log(`Loading: ${(progress.loaded / progress.total * 100).toFixed(1)}%`);
},
(error) => {
console.error('Environment load failed:', error);
}
);
}
addAvatar(playerData) {
// Avatar representation in 3D space
const avatarGeometry = new THREE.CapsuleGeometry(0.3, 0.8, 4, 8);
const avatarMaterial = new THREE.MeshStandardMaterial({
color: playerData.color || 0x3399ff
});
const avatar = new THREE.Mesh(avatarGeometry, avatarMaterial);
avatar.position.set(playerData.x || 0, 0.7, playerData.z || 0);
avatar.name = playerData.id;
this.scene.add(avatar);
}
animate() {
requestAnimationFrame(() => this.animate());
this.controls.update();
this.renderer.render(this.scene, this.camera);
}
}
// Initialize
const space = new VirtualSpace(document.getElementById('canvas-container'));
Real-time Multiplayer: The Hard Part
A static 3D environment is a tech demo. A metaverse is a shared space with multiple users. That’s where the complexity explodes.
// WebRTC signaling for multiplayer
import { io } from 'socket.io-client';
class MetaverseClient {
  constructor(serverUrl, roomId, userData, scene) {
    this.socket = io(serverUrl, {
      transports: ['websocket'],
      upgrade: false
    });
    this.roomId = roomId;
    this.userData = userData;
    this.scene = scene; // the Three.js scene holding avatar meshes (used by updateAvatarPosition)
    this.peerConnections = new Map();
    this.localStream = null;
    this.setupSignaling();
  }
setupSignaling() {
this.socket.on('connect', () => {
console.log('Connected to metaverse server');
this.socket.emit('join-room', {
roomId: this.roomId,
userData: this.userData
});
});
// Handle new user joining
this.socket.on('user-joined', async (userData) => {
const pc = await this.createPeerConnection(userData.id);
this.peerConnections.set(userData.id, pc);
// Send offer to new peer
const offer = await pc.createOffer();
await pc.setLocalDescription(offer);
this.socket.emit('offer', {
targetId: userData.id,
offer: pc.localDescription
});
});
// Handle offer from existing peer
this.socket.on('offer', async ({ offer, senderId }) => {
const pc = await this.createPeerConnection(senderId);
this.peerConnections.set(senderId, pc);
await pc.setRemoteDescription(offer);
const answer = await pc.createAnswer();
await pc.setLocalDescription(answer);
this.socket.emit('answer', {
targetId: senderId,
answer: pc.localDescription
});
});
// Handle answer
this.socket.on('answer', async ({ answer, senderId }) => {
const pc = this.peerConnections.get(senderId);
if (pc) {
await pc.setRemoteDescription(answer);
}
});
// Handle ICE candidates
this.socket.on('ice-candidate', async ({ candidate, senderId }) => {
const pc = this.peerConnections.get(senderId);
if (pc) {
await pc.addIceCandidate(candidate);
}
});
// Handle position updates (for avatar sync)
this.socket.on('position-update', ({ userId, position, rotation }) => {
this.updateAvatarPosition(userId, position, rotation);
});
}
async createPeerConnection(peerId) {
const pc = new RTCPeerConnection({
iceServers: [
{ urls: 'stun:stun.l.google.com:19302' },
{ urls: 'stun:stun1.l.google.com:19302' }
]
});
// Add local audio/video if available
if (this.localStream) {
this.localStream.getTracks().forEach(track => {
pc.addTrack(track, this.localStream);
});
}
// Handle incoming tracks
pc.ontrack = (event) => {
this.handleIncomingTrack(peerId, event);
};
// Send ICE candidates to signaling server
pc.onicecandidate = (event) => {
if (event.candidate) {
this.socket.emit('ice-candidate', {
targetId: peerId,
candidate: event.candidate
});
}
};
pc.oniceconnectionstatechange = () => {
if (pc.iceConnectionState === 'disconnected') {
console.log(`Peer ${peerId} disconnected`);
this.handlePeerDisconnect(peerId);
}
};
    return pc;
  }
  handleIncomingTrack(peerId, event) {
    // Play remote audio (one element per peer; spatializing it per-avatar is left to the reader)
    let audio = document.querySelector(`audio[data-peer-id="${peerId}"]`);
    if (!audio) {
      audio = document.createElement('audio');
      audio.dataset.peerId = peerId;
      audio.autoplay = true;
      document.body.appendChild(audio);
    }
    audio.srcObject = event.streams[0];
  }
  handlePeerDisconnect(peerId) {
    const pc = this.peerConnections.get(peerId);
    if (pc) pc.close();
    this.peerConnections.delete(peerId);
    const avatar = this.scene.getObjectByName(peerId);
    if (avatar) this.scene.remove(avatar);
  }
broadcastPosition(position, rotation) {
this.socket.emit('position-update', {
roomId: this.roomId,
position,
rotation
});
}
updateAvatarPosition(userId, position, rotation) {
const avatar = this.scene.getObjectByName(userId);
if (avatar) {
avatar.position.set(position.x, position.y, position.z);
avatar.rotation.y = rotation.y;
}
}
cleanup() {
this.peerConnections.forEach(pc => pc.close());
this.peerConnections.clear();
this.socket.disconnect();
}
}
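To tie the two classes together, here is a hypothetical wiring sketch, reusing the space instance created at the end of the Three.js example. The scene argument matches the amended constructor above; the server URL and room ID are placeholders:
// Hypothetical glue: pass the scene so the client can move remote avatars,
// and close connections when the tab is abandoned
const client = new MetaverseClient('https://metaverse.example.com', 'room-42', { id: 'user-1' }, space.scene);
window.addEventListener('beforeunload', () => client.cleanup());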
What Businesses Are Actually Building
The use cases that have real traction:
Corporate Training and Simulations
This is where the money is. Companies using metaverse platforms for:
- Safety training simulations (hazardous environment practice)
- Equipment operation training (virtual machinery, no risk)
- Customer service training (role-play scenarios)
- Compliance training (interactive decision trees)
The ROI calculation is simple: a VR flight simulator costs far less than training pilots on a $50M aircraft. Medical training, industrial equipment, hazardous scenarios: these all have expensive or dangerous real-world training alternatives.
# Example: Training analytics backend
def track_training_session(session_data):
"""
Track user performance in a metaverse training session.
Store in time-series database for analysis.
"""
session_record = {
'session_id': session_data['id'],
'user_id': session_data['user_id'],
'scenario_id': session_data['scenario_id'],
'start_time': session_data['start_time'],
'end_time': session_data['end_time'],
'performance_score': session_data['score'], # 0-100
'critical_decisions': session_data['decisions'], # List of decisions made
'hazards_encountered': session_data['hazards'],
'completion_status': session_data['completed']
}
    # Store for analysis (timeseries_db stands in for your time-series client,
    # e.g. InfluxDB or Timestream)
    timeseries_db.insert('training_sessions', session_record)
    # Real-time alerting for poor performance (notify_manager is app-specific)
    if session_data['score'] < 50:
        notify_manager(session_data['user_id'], 'training_at_risk')
Virtual Events and Conferences
The pandemic accelerated this. Companies that pivoted to virtual events discovered that some events work better in 3D — particularly when:
- Networking is a primary value (random encounters in virtual space feel more natural)
- Exhibitors need hands-on demos (3D product interaction)
- Workshops require collaborative work (breakout rooms with shared tools)
The post-pandemic question is whether this sticks. Early data: events under 200 people work well virtually. Events over 500 people still benefit from being in person, but a hybrid virtual option is becoming expected.
Virtual Showrooms and Product Visualization
B2B companies selling complex products (industrial equipment, medical devices, architectural services) use metaverse platforms for:
- Interactive product demos that prospects can explore independently
- Configuration tools where customers build custom specs in 3D (a rendering sketch follows this list)
- Remote sales support without travel
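On the rendering side, a configuration tool mostly reduces to mutating a loaded model; everything else is ordinary web development. A sketch, assuming a glTF product model with a material slot named 'Body' (both names are hypothetical):
// Hypothetical configurator primitive: recolor a named material on the model
function setProductColor(productModel, hexColor) {
  productModel.traverse((child) => {
    if (child.isMesh && child.material && child.material.name === 'Body') {
      child.material.color.set(hexColor); // e.g. setProductColor(model, '#ff6600')
    }
  });
}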
Healthcare and Telemedicine
The emerging use case: virtual consultations for mental health (VR exposure therapy), physical therapy (guided exercise in VR), and medical training (surgical procedure practice).
The Technical Challenges Nobody Talks About
VR Headset Adoption
The Meta Quest 2/3 is the most deployed consumer headset. But enterprise adoption of VR is still low. The 2023 reality: your metaverse platform needs a web-based fallback that works without a headset. If you build only for VR, you serve maybe 5% of your potential audience.
Latency and Networking
Real-time multiplayer in 3D is unforgiving. 100ms latency makes a virtual conversation feel like a walkie-talkie. 30ms is acceptable. 10ms feels natural.
Getting consistent low latency requires:
- Edge deployment (your 3D assets need to be close to users)
- Optimistic prediction (client predicts movement and reconciles with the server; sketched after this list)
- Authoritative server for critical state (inventory, scores)
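A minimal sketch of the prediction-and-smoothing half on the client, assuming the VirtualSpace and MetaverseClient instances from the earlier examples: broadcast your own transform at a fixed tick rate, and interpolate remote avatars toward their last reported position instead of snapping them.
// Broadcast the local transform at 10Hz; higher rates waste bandwidth for avatars
setInterval(() => {
  client.broadcastPosition(
    { x: space.camera.position.x, y: space.camera.position.y, z: space.camera.position.z },
    { y: space.camera.rotation.y }
  );
}, 100);

// Call from the render loop. `targets` is a Map of userId -> THREE.Vector3
// holding each peer's last received position; lerping hides network jitter.
function smoothRemoteAvatars(scene, targets, alpha = 0.15) {
  targets.forEach((target, userId) => {
    const avatar = scene.getObjectByName(userId);
    if (avatar) avatar.position.lerp(target, alpha);
  });
}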
3D Asset Pipeline
Every metaverse platform has the same bottleneck: 3D artists can’t produce content fast enough to fill virtual worlds. Solutions being tried:
- Procedural generation (AI-generated environments)
- User-generated content (creator tools)
- Asset marketplaces (pre-built 3D models)
- Photogrammetry (converting real-world scans to 3D; see the compression sketch below)
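Whichever of these fills the pipeline, the output tends to be heavy, photogrammetry scans especially. A sketch of the standard web-delivery mitigation, Draco-compressed glTF via three.js (assumes you host the decoder files yourself, here under the hypothetical path /draco/):
import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';
import { DRACOLoader } from 'three/examples/jsm/loaders/DRACOLoader.js';

// Draco compression can cut mesh payloads dramatically, which matters when
// environments stream from a CDN
const dracoLoader = new DRACOLoader();
dracoLoader.setDecoderPath('/draco/'); // decoder files copied from the three.js package
const gltfLoader = new GLTFLoader();
gltfLoader.setDRACOLoader(dracoLoader);
gltfLoader.load('/assets/environments/conference-room.glb', (gltf) => scene.add(gltf.scene));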
Accessibility
3D spaces are hostile to users with vestibular disorders, visual impairments, and certain motor disabilities. Building an inclusive metaverse requires:
- Non-VR fallback modes
- Color blind-friendly palettes
- Keyboard-only navigation options (a minimal sketch follows this list)
- Audio cues and descriptions for visually impaired users
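To make the keyboard-only item concrete, a minimal movement fallback for the Three.js setup from earlier (assumes the VirtualSpace camera; call the helper inside animate() before rendering):
// WASD/arrow-key movement so non-VR, mouse-free users can still navigate
const pressedKeys = new Set();
window.addEventListener('keydown', (e) => pressedKeys.add(e.code));
window.addEventListener('keyup', (e) => pressedKeys.delete(e.code));

function applyKeyboardMovement(camera, speed = 0.1) {
  if (pressedKeys.has('KeyW') || pressedKeys.has('ArrowUp')) camera.translateZ(-speed);
  if (pressedKeys.has('KeyS') || pressedKeys.has('ArrowDown')) camera.translateZ(speed);
  if (pressedKeys.has('KeyA') || pressedKeys.has('ArrowLeft')) camera.translateX(-speed);
  if (pressedKeys.has('KeyD') || pressedKeys.has('ArrowRight')) camera.translateX(speed);
}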
Evaluating Platform Choices
| Platform | Best For | Technical Depth | Customization | Cost |
|---|---|---|---|---|
| Spatial | Meetings, collaboration | Medium | SDK-based | Subscription |
| Engage | Training, enterprise events | High | Full API | Subscription |
| Virbela | Campus, education | Medium | Template-based | Subscription |
| Custom WebGL | Full control, unique experiences | Very High | Everything | Development cost |
| Decentraland | Public, open, speculative | Medium | Smart contracts | Land + gas |
For most businesses, a platform-based approach (Spatial, Engage) is more cost-effective than custom development. Custom WebGL is for companies with unique requirements, sufficient budget, and 6+ months of development time.
What Changed Recently (2024-2026)
The metaverse landscape shifted significantly between 2023 and 2026. Here’s what actually landed:
Hardware matured: Quest 3 and Apple Vision Pro. Meta Quest 3 (released October 2023) brought improved color passthrough, higher-resolution displays, and notably better hand tracking to enterprise VR; the cheaper Quest 3S (2024) widened fleet deployments. The Quest 3 became the dominant standalone VR device for enterprise work. Apple Vision Pro (February 2024, $3,499) targeted enterprise and creative professionals; visionOS 2.x brought improved productivity features, but the price point limits mass adoption. For most enterprise metaverse work, Quest 3 remains the primary target.
WebXR became the cross-platform standard. Chromium-based browsers and the Meta Quest browser support the WebXR Device API, and Safari added WebXR support on visionOS in 2024. This means your web-based metaverse experience works on Quest (via the built-in browser), HoloLens, and desktop without separate builds. If you’re building a web-first metaverse, WebXR is the answer. Check support in code:
// Check WebXR support, including whether immersive VR sessions are available
if (navigator.xr) {
  const vrSupported = await navigator.xr.isSessionSupported('immersive-vr');
  console.log(vrSupported ? 'Immersive VR supported' : 'WebXR present, but no immersive VR');
} else {
  console.log('WebXR not supported');
}
A-Frame for rapid WebXR prototyping. If Three.js is too low-level for your team, A-Frame provides a declarative framework for WebXR that’s much faster to iterate with:
<script src="https://aframe.io/releases/1.5.0/aframe.min.js"></script>
<a-scene>
<a-box position="-1 0.5 -3" rotation="0 45 0" color="#4CC3D9"></a-box>
<a-sphere position="0 1.25 -5" radius="1.25" color="#EF2D5E"></a-sphere>
<a-cylinder position="1 0.75 -3" radius="0.5" height="1.5" color="#FFC65D"></a-cylinder>
<a-plane position="0 0 -4" rotation="-90 0 0" width="4" height="4" color="#7BC8A4"></a-plane>
<a-sky color="#ECECEC"></a-sky>
<a-camera><a-cursor></a-cursor></a-camera>
</a-scene>
Unity 6 and Unreal Engine 5.4 improved VR tooling. Both engines shipped improved XR development tools and real-time ray tracing, and Unreal’s Nanite and Lumen enable photorealistic environments. If you’re building native VR apps, standalone on Quest or PCVR via SteamVR, Unity with the XR Interaction Toolkit is the mature path:
GameObject hierarchy for XR Rig:
- XR Origin (XR Interaction Toolkit)
  - Camera Offset
    - Main Camera
  - Left Controller (XRController)
  - Right Controller (XRController)
    - Direct Interactor
- Cube (with XRGrabInteractable component)
React-Three-Fiber + React-XR for React teams. If your team lives in the React ecosystem, @react-three/xr with @react-three/fiber provides a React-native way to build WebXR experiences:
npm install @react-three/fiber @react-three/drei @react-three/xr
import { Canvas } from '@react-three/fiber'
import { XR, createXRStore } from '@react-three/xr'
import { Box, OrbitControls } from '@react-three/drei'
const xrStore = createXRStore()
function App() {
return (
<>
<button onClick={() => xrStore.enterVR()}>Enter VR</button>
<Canvas>
<XR store={xrStore}>
<ambientLight />
<Box args={[1, 1, 1]} />
</XR>
<OrbitControls />
</Canvas>
</>
)
}
AWS Sumerian is gone; SimSpace Weaver and IoT TwinMaker are in. AWS Sumerian was deprecated in 2023. The replacements: Amazon SimSpace Weaver (2023) handles large-scale real-time 3D simulation, though AWS has since stopped onboarding new SimSpace Weaver customers, and AWS IoT TwinMaker handles digital twin applications with IoT-connected 3D models. If you were evaluating Sumerian for enterprise digital twins, IoT TwinMaker is where AWS is investing.
Enterprise adoption in 2024-2025 was real but selective. Virtual meeting spaces (Microsoft Mesh, Horizon Workrooms, Spatial) saw significant enterprise adoption for remote collaboration. Training simulations — particularly for hazardous environments, medical procedures, and equipment operation — continued to be the strongest ROI case. Digital twin implementations also grew, particularly in manufacturing and logistics.
Gotchas That Will Sink Your Metaverse Project
Motion sickness is the number one risk. Frame drops below 72fps, excessive motion, or incorrect interpupillary distance (IPD) cause nausea within minutes. Test on real hardware. Emulators do not accurately simulate the nausea that comes from latency or dropped frames in VR.
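A crude development-time watchdog helps catch frame-budget regressions before they reach a headset; it is a browser-side sketch, not a substitute for hardware testing:
// At 72fps the frame budget is ~13.9ms; warn whenever a frame blows past it
let lastFrameTime = performance.now();
function watchFrameBudget(now) {
  const delta = now - lastFrameTime;
  if (delta > 1000 / 72) console.warn(`Slow frame: ${delta.toFixed(1)}ms`);
  lastFrameTime = now;
  requestAnimationFrame(watchFrameBudget);
}
requestAnimationFrame(watchFrameBudget);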
Platform fragmentation is painful for native apps. WebXR works everywhere. But if you build a native Unity or Unreal app, you need separate builds for Quest, SteamVR, and Windows Mixed Reality. Each platform has its own submission pipeline, policies, and certification process. Budget an extra 2-3 months for multi-platform native builds.
Apple Vision Pro is a walled garden. Native visionOS apps must be built with RealityKit or Unity with PolySpatial, so existing Unity 3D projects require adaptation. Safari on visionOS added WebXR support in visionOS 2, which gives web experiences a path onto the device, but if you need native Vision Pro support, that’s a separate development track.
VR captures sensitive biometric data. Quest Pro captures eye tracking and facial expressions. This is GDPR-relevant data. Comply with platform privacy policies and your own privacy policy. Users need to consent to biometric data collection explicitly.
WebGL performance ceiling. Browser-based WebGL has lower performance than native apps. If you’re building complex 3D scenes (thousands of objects, real-time physics, large multiplayer spaces), you’ll hit a performance wall in the browser that doesn’t exist in native. Plan accordingly.
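One standard way to push that wall back is instancing: one draw call for thousands of copies of a mesh rather than thousands of separate draw calls. A sketch with three.js’s InstancedMesh:
// 5,000 boxes in a single draw call instead of 5,000 separate meshes
const boxGeometry = new THREE.BoxGeometry(0.1, 0.1, 0.1);
const boxMaterial = new THREE.MeshStandardMaterial({ color: 0x3399ff });
const instanced = new THREE.InstancedMesh(boxGeometry, boxMaterial, 5000);
const transform = new THREE.Matrix4();
for (let i = 0; i < 5000; i++) {
  transform.setPosition(Math.random() * 50 - 25, Math.random() * 5, Math.random() * 50 - 25);
  instanced.setMatrixAt(i, transform);
}
scene.add(instanced);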
Accessibility is not optional. VR fundamentally excludes users with vestibular disorders, limited mobility, or visual impairments. If accessibility matters for your use case (and it should), you need a non-VR interface as a first-class fallback, not an afterthought.
The Honest Assessment
Metaverse-as-a-Service is real but niche. The use cases that make financial sense are:
- Training simulations with expensive real-world alternatives
- Remote collaboration where casual interaction matters
- Product visualization for complex B2B sales
- Virtual events with strong networking components
The use cases that don’t yet make financial sense:
- Consumer-facing virtual retail (low conversion rates, high development cost)
- General “metaverse presence” for brand building (vapor until proven ROI)
- Internal meetings (video calls work fine)
If you’re evaluating this for a client or your own business: start with the use case, not the technology. If the use case is compelling enough, the metaverse platform becomes a build-vs-buy decision. If the use case is “we should be in the metaverse,” don’t spend the money.
The posts on SaaS vs PaaS decision making and enterprise app development cover the architectural tradeoffs that apply to any platform-as-a-service evaluation, including metaverse platforms. For the broader AI-driven product landscape, the time management with ML post covers a different angle of applying AI to business workflows.