VR/MR Archives - AEC Magazine
https://aecmag.com/vr-mr/
Technology for the product lifecycle

Gamma AR integrates with BIMcollab
https://aecmag.com/vr-mr/gamma-ar-integrates-with-bimcollab/ | Thu, 29 Jun 2023

Enables ‘seamless synchronisation’ of site issues identified with Gamma AR to the BIMcollab platform.

Gamma AR, a tool that brings BIM models to the job site through augmented reality, is now integrated with BIMcollab, the issue management and model validation software.

The integration allows BIMcollab users to bring issues created in Gamma AR into their existing workflows.

According to Gamma AR, the integration helps firms address several ‘pain points’ – mistakes and subsequent rework, as well as duplicative documentation and follow-up tasks.

“With Gamma AR’s augmented reality integrated into BIMcollab, our users now have the possibility of on-site checking their models against reality. This new integration shows our commitment to open our platform for a broad range of connections and to support our users in building any workflow they need,” said Erik Pijnenburg, CEO and founder of BIMcollab.

Gamma AR uses phones or tablets to overlay BIM models on the construction site, helping to show what to build, cut or drill, and where, and supporting ‘efficient and accurate’ quality control.

Gamma AR also offers integrations with Autodesk Construction Cloud / Autodesk BIM 360 and Procore.

Arkio 1.5 enhances Revit and Rhino workflows
https://aecmag.com/vr-mr/arkio-1-5-enhances-revit-and-rhino-workflows/ | Mon, 24 Apr 2023

Arkio 1.5, the latest release of the collaborative spatial design tool, streamlines export speed and CAD to VR workflows

Arkio has introduced a new release of its collaborative spatial design tool focused on speed and workflows. According to the Icelandic company, Arkio 1.5 can now export large models from Revit and Rhino up to 10x faster.

In addition, users can now upload directly to Arkio on Quest and other mobile devices using Arkio Cloud, all from inside Revit and Rhino.

Version 1.5 also includes a number of performance and visual enhancements with better shadows and ambient occlusion on all supported devices.

Arkio now defaults to a smooth spectator camera when using PC VR and has improved the section tool to better support cutting of transparent geometry.

This release also includes a number of import, section and UI bug fixes.

Arkio allows AEC professionals to design interiors, sketch buildings and craft environments with their hands and collaborate using VR, desktop and mobile.

Resolve breaks down model size barriers on standalone VR headsets
https://aecmag.com/vr-mr/resolve-breaks-down-model-size-barriers-on-standalone-vr-headsets/ | Wed, 23 Nov 2022

Collaborative design review software also adds AR capabilities through support for Meta Quest Pro

Resolve, an XR-based collaborative design review tool for AEC projects, has introduced the ‘Wellington Engine’, a custom 3D engine designed to render huge BIM models with millions of polygons on standalone virtual reality devices.

This includes the new Meta Quest Pro, which also brings Augmented Reality (AR) capabilities to the software by taking advantage of the headset’s colour passthrough cameras.

Resolve’s ‘Wellington Engine’ is said to use ‘cutting edge’ techniques in virtualized geometry, occlusion culling and adaptive partitioning to enable the review of complex files without having to stream data from an external workstation/server or devote hours to model clean-up.
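Resolve has not published the internals of the Wellington Engine, but the ideas it names are standard real-time rendering techniques. The sketch below, in plain Python, shows the gist of spatial partitioning plus coarse visibility culling: object bounding boxes are bucketed into a grid, and only cells inside a crude view cone are kept for drawing. All names, cell sizes and thresholds are illustrative assumptions, not anything taken from Resolve.

```python
# Illustrative only: a toy grid partition plus view-cone cull, not Resolve's engine.
from dataclasses import dataclass
import math

@dataclass
class AABB:
    min_pt: tuple
    max_pt: tuple

    def centre(self):
        return tuple((a + b) / 2 for a, b in zip(self.min_pt, self.max_pt))

def partition(boxes, cell_size=10.0):
    """Bucket object bounding boxes into a coarse uniform grid
    (a simple stand-in for adaptive partitioning)."""
    grid = {}
    for box in boxes:
        key = tuple(int(c // cell_size) for c in box.centre())
        grid.setdefault(key, []).append(box)
    return grid

def visible_cells(grid, eye, view_dir, max_dist=60.0, fov_cos=0.5, cell_size=10.0):
    """Keep only cells roughly inside a view cone -- a crude stand-in for
    frustum and occlusion culling."""
    kept = []
    for key, boxes in grid.items():
        centre = tuple((k + 0.5) * cell_size for k in key)
        to_cell = tuple(c - e for c, e in zip(centre, eye))
        dist = math.sqrt(sum(d * d for d in to_cell))
        if dist == 0.0 or dist > max_dist:
            continue
        if sum(d * v for d, v in zip(to_cell, view_dir)) / dist >= fov_cos:
            kept.append((key, boxes))
    return kept

boxes = [AABB((x, 0, z), (x + 2, 3, z + 2)) for x in range(0, 100, 5) for z in range(0, 100, 5)]
grid = partition(boxes)
kept = visible_cells(grid, eye=(0.0, 1.7, 0.0), view_dir=(0.707, 0.0, 0.707))
print(f"{len(kept)} of {len(grid)} cells survive coarse culling")
```

A production engine would of course use hierarchical structures and GPU occlusion queries rather than a flat grid, but the payoff is the same: only a fraction of a huge model ever reaches the renderer.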

Resolve offers integration with Autodesk Construction Cloud.

Companies can link their account to Resolve and models will automatically update without any manual exports.

Resolve also connects with project management tools and syncs comments made in VR to external issue trackers without extra work.

How architects can help the Metaverse live up to its hype
https://aecmag.com/vr-mr/how-architects-can-help-the-metaverse-live-up-to-its-hype/ | Mon, 21 Nov 2022

By Roderick Bates, head of integrated practice, Enscape | Part of Chaos

The metaverse is set to go mainstream, with the potential to be a $200 billion market by 2024, according to an estimate by analysts at Bloomberg.

Achieving this level of success will require the metaverse to become a destination that people want and need to visit. Architects, more than any other profession, understand what is required to create compelling environments, placing the profession in a unique position to take a lead in bringing definition to the metaverse.

VICEverse by BIG. Image courtesy of Vice Media

Currently, the ultimate metaverse vision of a single, universal and immersive virtual world — facilitated by the use of virtual and augmented reality (VR/AR) headsets — exists only as a hypothetical.

While some components and technologies are available today, it certainly isn’t universal, existing on a limited number of disconnected platforms like Roblox, Decentraland, Fortnite and Mona. For the metaverse to live up to the hype, it needs to become far more interconnected, enabling the type of collaboration and immersion previously only possible in the built environment.

Architects could lead the way

Long-time experts in creating the built environment, architects and designers are well ahead of the game and poised to take the lead in the metaverse’s creation.

The modern architectural design process requires teams well-versed in design technologies including BIM, 3D rendering and virtual reality to create immersive virtual spaces that showcase projects before they are actually built.

In addition to that creative process, there’s also a requirement for the design and delivery of a building to be highly collaborative, capable of engaging a wide range of stakeholders and consultants, while also operating within strict constraints when it comes to building codes, financial costs and physics. If one were to look for a cohort prepped and ready to take on the challenge of designing the metaverse, they would be hard pressed to find a better group than architects.

As metaverse development advances, architects have the opportunity to capitalise on expertise honed in delivering buildings in the physical world to lead the way in designing and delivering metaverse architecture governed by a new, and in some ways liberating, set of constraints.

For example, instead of considering circulation requirements and climatically driven water management strategies, architects can design spaces where occupants can teleport to, from and within a building. The pesky rules of physics, including water penetration risk, no longer apply.

Perhaps even more importantly, architects understand the ethical implications that come with designing environments for people. To quote from the American Institute of Architects, they have a longstanding professional obligation to “design for human dignity and the health, safety and welfare of the public.” From architects’ experience using visual communication tools during the design process, and from studying the way buildings interact with their occupants, architects can bring to the metaverse a much needed ethical awareness of how it might affect users, for both good and bad. With their ethical standards, architects can be a key voice for ensuring the metaverse is a constructive environment, rather than a new medium for manipulation.

While the metaverse may be in its ascendancy, there is clearly room for architects to assert themselves and help the metaverse realise its potential and move from trend to established technology.

A metaverse architecture

Despite skill alignment, it can be challenging for architects to identify the right metaverse entry points. Here are three tips on defining opportunities and taking the first steps toward metaverse development projects:

Consider areas ripe for metaverse development

There are several key sectors that are prime for early metaverse development. These include commerce spaces that help companies showcase their products; media and entertainment spaces, including concert venues, fashion runways and galleries; and virtual offices for companies to develop and manage the critical collaboration spaces they need for contemporary hybrid work environments.

Firms can prepare to capture new opportunities by ensuring that they have a keen understanding of the metaverse culture and how its space impacts its users

For example, Danish architecture studio BIG recently worked with Vice Media Group, a digital media and broadcasting company, to design a virtual office for Vice employees. Dubbed Viceverse, the project has given the media firm an opportunity to inspire the metaverse community and at the same time redefine journalism.

“It’s not about just getting in there and planting a flag. We wanted to do it in style with aesthetics and ideas when creating, so we teamed up with BIG,” said Morten Grubak, global executive creative director of innovation, Vice Media Group.

Global architecture firm IAXR has also embarked on an exciting metaverse project, creating an office space that enables interior designers and project stakeholders to collaborate virtually. Their metaverse workspace is accessed via VR and acts much like an actual office space with custom workflows that mimic the way they are accustomed to working.

Build a pipeline of opportunities

Architects can begin to identify opportunities for offering metaverse development by targeting both existing and new clients. Firms that don’t yet have a portfolio of metaverse work to share can benefit from building their own presence in the metaverse and offering to develop pro bono metaverse spaces that showcase their talents and capabilities.

For a universe dependent on collaboration, it makes perfect sense for firms to partner with others to explore metaverse opportunities. They can identify firms or architects that they’d like to work with, ideally those with metaverse experience, and reach out using alternative platforms, such as Discord, to expand their reach and capabilities and explore potential clients.

Conscientiously market and educate

Selling a new service, including metaverse development, requires strategic marketing efforts. This includes marketing the capability as an additional service option to existing clients, for instance offering to build a metaverse commerce space for showcasing products, and advertising to a broader universe of potential clients.

In helping to define the metaverse, architects also have a responsibility to leverage their communication tactics to educate metaverse users, including the general public, about how they can be conscious consumers. Creating a code of ethics for metaverse projects that can be shared with clients and future users of the metaverse environments can help architects convey this important message.

As the metaverse unfolds, organisations across industries will look to create a stake and turn to architects and designers for guidance and know-how. Firms can prepare to capture these new opportunities by ensuring that they have a keen understanding of the metaverse culture and how its space impacts its users. Their success also depends on having the people, workflows, tech stacks and partnerships to build and manage the spaces as they evolve, and to tackle issues as they emerge, including ensuring proper governance and safe, seamless data transfer between platforms.

Top image: Hasham’s Metaverse: Hisham Laila, Creative Manager

Gamma AR optimises BIM model placement on site
https://aecmag.com/vr-mr/gamma-ar-optimises-bim-model-placement-on-site/ | Wed, 07 Sep 2022

BIM models now snap automatically to physical corners and edges of walls and columns

The developers of Gamma AR, the Augmented Reality (AR) construction app, have made it easier and faster to overlay BIM models on site.

BIM models can now snap to physical corners and edges of walls and columns on site automatically. According to the developers, identifying the precise corner and having an exact alignment allows users to more accurately overlay their models, in order to check the progress of projects and identify issues with pinpoint clarity.

When using an iPad or iPhone Pro with a LiDAR scanner, vertical snapping detects the corner even if it is hidden.
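Gamma AR has not published how its snapping works, but the basic idea of pulling a model onto a detected corner can be sketched in a few lines. The Python below is a hedged illustration only: it picks the model corner nearest a corner point detected on site and returns the translation that aligns them; the tolerance and all point values are made-up assumptions.

```python
import math

def distance(a, b):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def snap_offset(model_corners, detected_corner, tolerance=0.5):
    """Return the (dx, dy, dz) translation that moves the closest model corner
    onto the detected physical corner, or None if nothing lies within the
    tolerance (metres). Illustrative only, not Gamma AR's algorithm."""
    best = min(model_corners, key=lambda c: distance(c, detected_corner))
    if distance(best, detected_corner) > tolerance:
        return None
    return tuple(d - m for m, d in zip(best, detected_corner))

# A wall corner detected a few centimetres away from where the model places it.
model_corners = [(0.0, 0.0, 0.0), (4.0, 0.0, 0.0), (4.0, 3.0, 0.0)]
detected = (0.08, -0.05, 0.07)
print(snap_offset(model_corners, detected))  # -> (0.08, -0.05, 0.07)
```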

Bringing BIM models to the construction site through AR is useful for tracking and documenting project progress. Gamma AR, which runs on iPhone, iPad or Android devices (online or offline), can be used to create issues including photos, comments, notes and records.

For synchronising issues, the Gamma AR app supports IFC, Autodesk BIM 360, Autodesk Revit, and Autodesk Navisworks.

All the information collected on site can also be delivered in real-time to the Gamma BIM Portal. Models and issues can then be synchronised with Autodesk Construction Cloud, Autodesk BIM Collaborate Pro, Autodesk BIM 360 and Autodesk Build.

Issue information can also be downloaded as BCF, CSV and PDF to communicate with stakeholders.
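BCF (BIM Collaboration Format) archives are essentially zip files containing one folder per issue with a markup.bcf XML file inside. As a hedged sketch, assuming the common BCF 2.x layout rather than any specific tool’s export, the Python below lists issue titles from such an archive and writes them out as CSV; verify element names against the BCF schema version your tool actually produces.

```python
import csv
import sys
import zipfile
import xml.etree.ElementTree as ET

def local(tag):
    """Strip any XML namespace so lookups work with or without one."""
    return tag.rsplit("}", 1)[-1]

def read_bcf_topics(path):
    """Collect (guid, title) pairs from every markup.bcf in a BCF archive."""
    topics = []
    with zipfile.ZipFile(path) as zf:
        for name in zf.namelist():
            if not name.lower().endswith("markup.bcf"):
                continue
            root = ET.fromstring(zf.read(name))
            for el in root.iter():
                if local(el.tag) != "Topic":
                    continue
                title = next((c.text for c in el if local(c.tag) == "Title"), "")
                topics.append({"guid": el.get("Guid", ""), "title": title or ""})
    return topics

if __name__ == "__main__":
    writer = csv.DictWriter(sys.stdout, fieldnames=["guid", "title"])
    writer.writeheader()
    writer.writerows(read_bcf_topics(sys.argv[1]))
```

Usage would be along the lines of `python bcf_to_csv.py issues.bcfzip > issues.csv`, with the file name here purely hypothetical.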

Unity Reflect Review now available on Meta Quest 2
https://aecmag.com/vr-mr/unity-reflect-review-now-available-on-meta-quest-2/ | Fri, 05 Aug 2022

Norwegian multidisciplinary engineering and design consultancy Norconsult using untethered VR headset for collaborative design review

Unity Reflect Review, the immersive design review solution for architects and designers, is now available for the Meta Quest 2, the all-in-one VR headset previously known as Oculus Quest 2.

Norconsult, a leading Norwegian multidisciplinary engineering and design consultancy, has been using the new hardware/software combination to cover all phases of a project lifecycle – from the earliest pre-investment and feasibility studies, through planning and design, tendering and construction supervision, to project implementation, operations and maintenance.

“Combining Meta Quest 2 and Unity Reflect Review is a true game changer,” says Marius Jablonskis, digital transformation leader at Norconsult.

“XR was great before, but it was not for everyone. Hardware-heavy processes, cables, tracking stations, remembering to charge multiple devices, logging in and out to multiple accounts, pre-processing the data, exporting, packing and updating it – all these operations made the XR world an exclusive club.

“Unity and Meta’s fusion eliminated all the irritating moments and bottlenecks from the process. Now you just have to pop on the glasses and you’re good to go!”

Norconsult uses Autodesk Revit for design, but typically goes from Navisworks or BIM 360 to Unity Reflect Review.

“We use Unity Reflect Review for visual reviews – including stakeholder, design and safety evaluations – on desktop and in VR and augmented reality (AR),” explains Jablonskis.

Prior to the new integration, Norconsult used a custom application developed in the Unity Editor, compiled to .apk files and loaded onto untethered Meta Quest 2 devices, as Jablonskis explains. “The application was used for design review and practical safety evaluations with our safety and design experts, customers, and their operation personnel to evaluate safety aspects of the design and placement of equipment in emergency scenarios.”

“Instead of running a traditional session where everyone looks at the drawings – which would have been a challenge in the middle of a pandemic with capacity and other restrictions – we had several VR operation stations where multiple people could participate at once and the rest could livestream on the screen. That way everyone could participate in a live evaluation and feedback session regardless of their location,” he adds.

“Results were directly integrated into our project management system to ensure optimal insight and dataflow so that no disconnected reports ended up in unmonitored folders.”

Theorem-XR adds support for AEC systems
https://aecmag.com/vr-mr/theorem-xr-adds-support-for-aec-systems/ | Wed, 03 Aug 2022

eXtended Reality (XR) solution can now combine data from Revit, Navisworks and IFC with 3D CAD across a range of AR, MR and VR devices

Theorem-XR, an eXtended Reality (XR) solution that enables design, engineering and manufacturing firms to optimise, visualise, and collaborate around 3D design data regardless of location, now supports Revit, Navisworks and Industry Foundation Classes (IFC) formats.

With the new Q2 2022 release users can also load data from multiple sources into a single session. In addition to BIM, this includes 3D CAD (3DExperience, Catia, Alias, Creo, FBX, Inventor, NX, Solidworks, STEP and VRED), Product Lifecycle Management (PLM) data (JT) and scanned data.

According to Theorem Solutions, preparing data for Theorem-XR is a ‘seamless, fully automated process’. Users can ‘save as’ from their CAD session, drag and drop from the file system or drive it directly from their PLM workflow. Data can also be used for the creation of Unity or Unreal assets for internally developed XR solutions.

Theorem-XR works across a range of Augmented, Mixed and Virtual Reality devices, including Microsoft HoloLens 2, HTC Vive, Oculus Rift, Magic Leap, Oculus Quest and Android and iOS devices. To support large datasets across multiple devices, Theorem-XR uses Azure Remote Rendering (ARR).

According to Theorem, being able to offer remote rendering across all devices is especially beneficial for construction and factory layout, as it removes the barriers associated with working and interacting with huge quantities of design data.

Other new features of Theorem-XR Q2 2022 include support for the new location features in HoloLens 2, where users can define where 3D models appear in relation to a QR code.

According to Theorem, being able to render data in this way brings ‘greater accuracy and flexibility to the review process for tracking digital content against physical objects’.

Another new feature for Theorem-XR with HoloLens 2 is the ability to markup digital design data in a live session using a ‘holographic pen’.

Choosing the pen tool from the system menu enables users to add annotations to highlight required changes, potential clash issues or to leave a note of requirements for users unable to attend collaborative sessions.

Finally, users in VR can now view the assembly structure as they would in CAD. The option to highlight specific parts and components is designed to make it easier to navigate designs and go straight to the components they need to review within their scene.

Bricsys teams up with Vrex to accelerate VR workflow
https://aecmag.com/vr-mr/bricsys-teams-up-with-vrex-to-accelerate-vr-workflow/ | Wed, 03 Aug 2022

Users can create detailed models in BricsCAD BIM and export to Vrex in shared virtual space

Bricsys is collaborating with VR platform Vrex so users of BricsCAD BIM can more easily export models to the Vrex Virtual Reality (VR) platform.

The enhanced workflow is designed to enable architecture, engineering, construction, and operations (AECO) companies to streamline collaboration between the multiple stakeholders working together on a project in VR. Users can virtually meet inside the model, perform visual inspections and exchange building and project data from ‘any location’.

Both platforms support the exchange of BIM Collaboration Format (BCF) issues via interface services such as BIM Track, BIMsync and BIMcollab. According to Bricsys, this allows for the easy communication of issues emerging during the virtual meeting to be resolved within BricsCAD BIM.

“Vrex integrating with BricsCAD is another step towards the global BIM integration we want to achieve,” said Hans Fredrik Johansen, CEO of Vrex. “With Vrex-BricsCAD anyone, regardless of their experience level, can step into the 3D model and understand them instantly. It allows for crystal clear communication across teams and seamless collaboration in a virtual reality environment.”

Vrex also offers integrations with Autodesk BIM 360, Autodesk Navisworks, StreamBIM and Aconex. In addition to BIM data, Vrex supports point clouds, which can be used for QC/QA workflows.

ArcGIS Maps SDK for Unreal Engine launches
https://aecmag.com/gis/arcgis-maps-sdk-for-unreal-engine-launches/ | Thu, 21 Jul 2022

ArcGIS Maps SDK for Unreal Engine offers a set of tools for bringing data from ArcGIS directly into the Unreal Engine 5 development environment.

Esri has released version 1.0 of the ArcGIS Maps SDK (Software Development Kit) for Unreal Engine, which is designed to enable developers to build ‘world-scale’ AR, tabletop AR, and VR experiences for a range of sectors including AEC, utilities, and transportation.

According to Euan Cameron, CTO of developer technology at Esri, the SDK will allow developers to use the Unreal Engine 5 game engine to create new classes of applications around their data, creating immersive ways to visualize and interact with real-world GIS assets that complement their current workflows.

The SDK is the result of a long-standing collaboration between Esri and Epic Games. One of the key outcomes of that relationship has been Epic Games’ development of the GeoReferencing plugin for Unreal Engine which led to full support in Unreal Engine 5 for double-precision coordinates and the ability to accurately place geographic data on a global scale.
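To see why double-precision coordinates matter at a global scale, consider placing an asset by its geographic position. The sketch below (plain Python, not the GeoReferencing plugin’s actual code) converts a latitude/longitude to Earth-centred ECEF metres using the standard WGS84 formulas, then shows the rounding you would incur if those metre values were stored as 32-bit floats; the test coordinate is an arbitrary point near Rotterdam.

```python
import math
import struct

A = 6378137.0                 # WGS84 semi-major axis (m)
F = 1 / 298.257223563         # WGS84 flattening
E2 = F * (2 - F)              # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, h=0.0):
    """Convert geodetic latitude/longitude/height to ECEF metres (WGS84)."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    n = A / math.sqrt(1 - E2 * math.sin(lat) ** 2)
    x = (n + h) * math.cos(lat) * math.cos(lon)
    y = (n + h) * math.cos(lat) * math.sin(lon)
    z = (n * (1 - E2) + h) * math.sin(lat)
    return x, y, z

def as_float32(v):
    """Round-trip a value through 32-bit float storage."""
    return struct.unpack("f", struct.pack("f", v))[0]

if __name__ == "__main__":
    x, y, z = geodetic_to_ecef(51.92, 4.47)   # roughly Rotterdam
    err = max(abs(c - as_float32(c)) for c in (x, y, z))
    print(f"ECEF: ({x:.3f}, {y:.3f}, {z:.3f}) m")
    print(f"worst 32-bit rounding error: {err:.3f} m")  # tens of centimetres
```

With single precision the quantisation at Earth-radius magnitudes is on the order of decimetres, which is why engines that place content globally keep coordinates in doubles (or rebase a local origin) before handing metres to the renderer.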

Version 1.0 of the ArcGIS Maps SDK for Unreal Engine includes support for basemaps, which depict relatively static features like streets, buildings, facilities, and landscape details, elevation data for terrains, and geospatial layers such as 3D objects and integrated meshes.

According to Esri, Version 1.0 is just the beginning. The company explains that there’s a ‘rich vein of functionality’ waiting to be tapped into, such as Vector Tile Layers, Point Cloud layers, Feature support, geocoding, routing, and other analysis tools.

Image: Detailed 3D Object scene layer of Rotterdam buildings complete with textures, overlaid on Esri’s global imagery basemap.
Image: An integrated mesh of Girona, Spain, overlaid on top of global imagery and elevation.

“At Epic, we’re committed to building the most open and advanced real-time 3D creation tool, freeing up digital creators to leverage Unreal Engine’s capabilities as part of their existing content creation pipeline,” said Sébastien Lozé, Unreal Engine Simulations Business Director at Epic Games. “The work done by the Esri team in this collaboration will allow the entire GeoInt community to leverage the rich diversity of created content existing in the ArcGIS ecosystem, and we’re excited to see how the release of the ArcGIS Maps SDK plugin for Unreal Engine will create virtually unlimited immersive, interactive, synthetic environments in Unreal Engine 5.”

Urban planning and design firm Houseal Lavigne is an early adopter of the SDK. The company was recently hired by the Village of Glen Ellyn, Illinois, a Chicago suburban community, to build an application to assist the community in evaluating development proposals.

To help maintain the character of the cherished downtown main street in Glen Ellyn, Houseal Lavigne built an immersive video game-like application with Unreal Engine.

The app allows members of the community to see if a proposal fits into the existing context of their downtown by placing themselves directly into the 3D environment and experiencing potential change first hand. The app uses the ArcGIS Maps SDK for Unreal Engine to create an immersive environment based on 3D scene layers.

Image: A proposed development shown using Houseal Lavigne’s Immersive Development Viewer.

The SDK simplifies application updates by using ArcGIS data sources directly. It dynamically streams in the built environment from ArcGIS Online. Then, when the area changes, the client can publish a new scene layer and the application will update itself.

“It used to be that when we would talk to clients and client communities about change, we would show it on a map. But expectations have changed. Like everyone else, our clients and their constituents have been immersed in video games like Fortnite, and they watch movies shot on sound stages in front of green screens with beautiful 3D worlds drawn in behind the actors. They not only want to see a potential change, but they also want to experience it,” said Devin Lavigne principal and founder, Houseal Lavigne.

The ArcGIS Maps SDK for Unreal Engine is free to download.


Main image caption: San Francisco textured buildings with demographic data, global imagery, and elevation, viewed from within the Unreal Engine editor.

Arkio in the Metaverse of Madness
https://aecmag.com/vr-mr/arkio-in-the-metaverse-of-madness/ | Wed, 25 May 2022

Collaborative Spatial Design developers Arkio had an unusually quiet 2021. Little did we know they were storing up a huge pile of new features for their official launch on Meta’s Quest App store. Martyn Day looks at the new capabilities

It’s been almost five years since we first met Johan Hanegraaf at NXT BLD, where he gave a demonstration of a personal project that explored what a VR design system for architects would look like.

Now a professional product called Arkio, with users all around the world, the greatly refined application allows teams to share models for collaborative VR sessions, supporting data from Revit, Rhino, Unity and other 3D tools, and providing basic modelling capabilities.

While firms could access the application on the Meta Quest store if they searched for the exact name, the app was still in the development category (‘Applab’). This month, however, Arkio made its official debut with version 1.2. And what an update this is! It’s a significant leap in capability.

The basics

There has been a lot of work done on the UI as you enter the application, making it more of a personal experience and easier to sort through collections of previous models and access training. There is a new ‘meta’ concept of the Arkio HQ, a set of virtual buildings in the Arkio VR – a gallery building, an auditorium and a training centre.

The gallery building contains exhibits of work done by people using Arkio, which will continue to evolve. The auditorium will be used to hold live demonstrations from the development team, while the training space will contain lessons to see what’s possible with Arkio.

Image: Arkio has boosted its support for BIM data

The concept of the exhibition space is particularly interesting. While it’s a demonstration of work that’s been done by users of the application, I wonder if it’s possible to have your own Arkio exhibition building, where firms could store and exhibit models of their past work, or even perhaps projects currently on site. All those competition entries which fail to get built could find a home in a permanent virtual exhibition for prospective clients, or employees, to see the practice’s work in 3D.

While there is a lot of rubbish spouted about the metaverse, currently the marketing of architects’ work resides on websites, in photos and sketches. Given we’re all making 3D models, Arkio can reuse that data for everyone else to experience in VR.

Avatars and hands

In the previous version, avatars were pretty shapeless forms. Now Arkio has adopted the Meta-style avatars, bringing some sense of personification to the individuals in a collaborative session. If you have set up your avatar in Meta already, then this is the version that will appear in Arkio automatically.

In the Oculus and Rift environments, hands are now ‘a thing’: they appear in the view and hand tracking is supported. This might sound odd, perhaps a waste of polygons, but it’s a massive update and helps anchor the user in the VR environment.

The Arkio team have been working very closely with Meta to improve performance. Now models which are 3x larger can be imported to the Quest. The frame rate has doubled and scenes can be over twice as complex. This has required a large amount of work from the development team and has meant replacing the graphics rendering pipeline.

The addition of meta-style avatars and hands might sound like a waste of polygons, but it brings some sense of personification to the collaborative session and helps anchor the user in the VR environment

When using Arkio for simple massing, obviously you’ll not be pushing the system particularly hard, but those who want to import complex models from Revit or SketchUp will appreciate the additional headroom. Considering standalone VR headsets like the Quest are driven by low-power mobile CPUs/GPUs, it’s frankly quite amazing. Unlike games geometry, geometry from BIM systems is much less optimised for graphics performance and tends to rely on powerful CPUs and GPUs. Arkio is doing a sterling job.

Precision and components

The lack of precision editing in geometry has also been addressed. While it’s still possible to grab and pull faces, for typically inaccurate massing editing, it’s now also possible to edit by typing in the dimensions to drive the geometry. This feature has been highly requested by users and pushes the direction of development from addressing simple massing to becoming more of an all-round architectural modelling solution.

And if enhanced precision wasn’t a good enough indicator of Arkio’s modelling ambitions, the new release contains the first draft of an architectural components library.

Previous releases were limited to creating prismatic shapes and then editing them. This was good enough for massing studies but, when it came to architectural modelling, the lack of accuracy and the need to model with basic shapes limited the outcomes.

The introduction of architectural components is a significant milestone. If anything, it’s probably the most significant part of this release which is packed full of fantastic new features.

Arkio now has a library of windows, doors, stairs etc. — over 100 parametric components. In the last release one would simply punch a hole through a wall to create a door or a window. Now it’s possible to draw with components. Simply by placing them, they automatically cut through walls and generate to fit the space required.

I’ve been told that there are capabilities not unlike Revit families underpinning this, which will eventually be fully exposed for users to create and customise their own libraries.

This capability is a marked improvement and is a demonstration of delivering a tool with real design intent. The combination of component library with greater feedback of dimensions takes Arkio to another level.
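As a toy illustration of that idea (and not Arkio’s actual component system, which has not been published), the Python below models a window component that, when placed on a wall, computes the opening it should cut, clamped so the hole always fits within the wall. All dimensions are made up.

```python
from dataclasses import dataclass

@dataclass
class Wall:
    length: float   # metres along the wall
    height: float

@dataclass
class Window:
    width: float
    height: float
    sill: float     # height of the sill above the floor

def opening_for(wall: Wall, win: Window, offset: float):
    """Return (x0, z0, x1, z1) of the rectangular cut in wall coordinates,
    shrunk if the requested window does not fit. Illustrative only."""
    x0 = max(0.0, offset)
    x1 = min(wall.length, offset + win.width)
    z0 = max(0.0, win.sill)
    z1 = min(wall.height, win.sill + win.height)
    return (x0, z0, x1, z1)

wall = Wall(length=6.0, height=2.7)
print(opening_for(wall, Window(width=1.2, height=1.4, sill=0.9), offset=2.0))
# -> (2.0, 0.9, 3.2, 2.3): the hole the component would cut through the wall
```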

Pass through modelling

When I first saw the new pass-through modelling feature demonstrated, I wondered how on earth I was going to explain it in words. So here goes. The Oculus Quest headset has what is called a pass-through camera. This is a low resolution black and white camera which is primarily used to help the user initially set up a safe zone and spatial barrier in which to move around.

As you are completely blind while wearing a VR headset, pass-through is a way to connect to the real world. Arkio 1.2 has introduced the concept of passthrough modelling.

From within the VR environment, it’s possible to toggle the pass-through camera. This will display the room you’re in and allows you to model the walls, floor or ceiling.

Image: Pass through modelling in Arkio

Arkio has a new transparent material called pass-through, which allows you to see the edges of the geometry in passthrough camera mode. If you punch a hole through any of these walls, the view is of the VR world beyond. So you’re in a real room, seeing a live feed from a camera, but the view through the window is of the Arkio VR world. This is mildly disconcerting. It’s also possible to model rectangles around the furniture in your room and literally delete it and replace the furniture with Arkio components. So now you can edit the real world, and place virtual furniture within it, while looking out into the virtual world beyond!

The next generation of headsets will have full colour pass-through and, hopefully, higher resolution. This blended reality capability is really at a formative stage but gives an indication of where Arkio is heading. The development team said they always envisaged the product to be both VR and AR, so as AR glasses come out, expect to see Arkio models blending in.

Unity

This release sees a lot of effort put into connecting Arkio with Unity, with scenes able to be round-tripped between the two.

Up to 2,000 game objects can be brought into Arkio, moved or modelled on top of, and then sent back to Unity. It seems game developers are also interested in using the software for level design.

While this may not seem important for architecture, expanded usage will also drive improvements. To get performance, games developers heavily utilise level of detail (LoD) techniques, so models are displayed with higher fidelity nearer the viewpoint and lower fidelity further away. While Arkio currently doesn’t support LoD optimisation, it will be coming at some point. For architects this means the possibility of loading bigger models and having faster VR.
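The core of distance-based LoD is simple to show. The snippet below is a minimal, generic sketch (not Arkio’s or any particular engine’s implementation): each object is drawn with the mesh variant whose distance band it falls into, and anything beyond the last band is culled; the band distances and mesh names are invented for illustration.

```python
import math

# (maximum distance in metres, which mesh variant to draw) -- invented values
LOD_BANDS = [(15.0, "lod0_full_detail"), (50.0, "lod1_medium"), (150.0, "lod2_low")]

def pick_lod(eye, obj_pos):
    """Return the mesh variant to draw for an object, or None to cull it."""
    dist = math.dist(eye, obj_pos)
    for max_dist, mesh in LOD_BANDS:
        if dist <= max_dist:
            return mesh
    return None  # beyond the last band: skip the object entirely

eye = (0.0, 1.7, 0.0)
for pos in [(5.0, 0.0, 5.0), (30.0, 0.0, 30.0), (200.0, 0.0, 0.0)]:
    print(pos, "->", pick_lod(eye, pos))
```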

For me, the most important new feature is the inclusion of a component library, which points towards taking the product beyond SketchUp levels of architectural modelling

There is also a range of smaller individual features, like the addition of sticky notes, the incorporation of more Revit data and the ability to sketch in 3D.

Conclusion

This has been a massive update for the collaborative VR software. It’s clear to see that now the groundwork has been done on the base system, the development teams are fleshing out and refining more advanced capabilities, across a wide range of functionality. For me, the most important new feature is the inclusion of a component library, which points towards taking the product beyond SketchUp levels of architectural modelling. Massing is great but I had always hoped for more, and now it’s here.

Arkio also looks set to become the gateway tool to repurpose architectural models for the metaverse. The team has obviously been working very closely with Meta and I’m sure it’s not lost on the Meta team that this is a great bridge to get the industry into the metaverse – should we wish to cross it. While I am not sure about 99% of the hype around the metaverse, I can buy into the concept of there being a virtual space where architects can collate and repurpose their historic 3D projects and allow customers, students or fans to experience their built and unbuilt work in VR.
