The Future AEC Software Specification

A wish list of what leading edge BIM practices want from next generation tools

At our inaugural NXT DEV conference, one of the keynote talks, given by Aaron Perry of the Open Letter Group, announced a free-to-all Future AEC Software Specification, backed by global firms such as BIG, HOK, Herzog & de Meuron, KPF, Woods Bagot and ZHA. What should next generation tools be able to do? Martyn Day gives the backstory

In London on 21 June, at AEC Magazine’s NXT DEV event, Aaron Perry, head of digital design at AHMM, gave a masterful presentation on the ‘Future AEC Software Specification’, a document that looks destined to have major influence on the future of AEC software.

Perry spoke on behalf of a peer group of leading architectural practices, which have been discussing the future of design tools and how to drive development across the entire design software community.

The specification originated with the UK Open Letter Group (OLG), who wrote that infamous letter to Autodesk CEO, Andrew Anagnost in 2020, but it is a much wider global effort.

It is currently being shared amongst the Nordic OLG firms (authors of a second open letter), and the highly influential AIA Large Firm Roundtable (LFRT) for approval and revisions.

The specification addresses the whole software development community, including those that invest in technology, and provides a level of customer insight that, quite frankly, is not available anywhere else. It asks, what is the future of our industry from a digital design point of view? It is a truly unique offering.

In his talk, Perry first points out that the specification is not a definition for a single monolithic product. The industry needs an ecosystem of best-in-class tools that can work together and are applicable to large firms all the way down to single users. There are ten pillars to the specification. We'll look at some of the key ones below.




Aaron Perry Future AEC Software Specification presentation on-demand


Data framework

Fundamental to the specification is the data framework. When trying to deliver complicated projects, firms use many software tools. Every time data is exported between the various siloed tools, data is lost. As a result, most firms accept the status quo and try not to move their data between various file formats, choosing to use suites of average tools.

Many firms still try to use the best tool for each problem, but they pay for it through translators, complexity, plug-ins and specialist software. This is the biggest drain on time and energy in the industry. A solution would be something like USD (Universal Scene Description), which is open source and has solved most of the data transmission problems in the CGI world. It's a common standard that transmits geometry, lighting and materials to pretty much any media and entertainment application, whoever developed it.

The AEC industry faces a similar problem. With USD the common data structure sits outside the host products. This also means different teams, with different specialisms can open the USD file and work on their part of the scene. This creates a level playing field for software developers.
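To make the USD analogy concrete, here is a minimal sketch using the open-source OpenUSD Python bindings (the pxr module). It authors a tiny scene and then composes a second discipline's contribution as a sublayer, so the data structure lives outside any one host product. The file names, prim paths and the thickness_mm attribute are illustrative assumptions, not part of any AEC schema proposed by the specification.

```python
from pxr import Usd, UsdGeom, Sdf

# The architectural team authors the root layer of a small scene.
stage = Usd.Stage.CreateNew("building.usda")
UsdGeom.Xform.Define(stage, "/Building")
wall = UsdGeom.Mesh.Define(stage, "/Building/Wall_001")
# Attach an arbitrary parameter to the wall prim (illustrative only).
wall.GetPrim().CreateAttribute("thickness_mm", Sdf.ValueTypeNames.Float).Set(215.0)

# A structural team contributes through its own layer, composed non-destructively.
Sdf.Layer.CreateNew("structure.usda").Save()
stage.GetRootLayer().subLayerPaths.append("structure.usda")
stage.GetRootLayer().Save()

# Any USD-aware tool can later open the composed scene and read just what it needs.
reopened = Usd.Stage.Open("building.usda")
for prim in reopened.Traverse():
    print(prim.GetPath(), prim.GetTypeName())
```

Because each team works in its own layer, the same composed scene can be opened by any USD-aware application, which is the level playing field Perry refers to.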

Read the Mission Statement of the Future AEC Software Specification

According to Perry, USD has some challenges. It’s a file format, as opposed to a database, there are variants as it’s not a perfect specification, and it is not the right format for BIM data.

Perry explained how different participants in projects need to access different levels of detail of data – specific data sets for structural engineers, cost consultants, sub-contractors who don't need the whole model, and energy simulation experts.

"The concept is of the data framework being outside of these file formats, enabling different people to access, author and modify geometry and parameters concurrently. This is by entity or component, not requiring the opening of the full model to review a single string of data," he said.

“At this entity level, collaborators can concurrently commit changes and coauthor packages of information with a full audit trail. Moving away from file formats and having a centralised data framework enables local data to be committed to a centralised data framework and allow entire supply chains to access different aspects of the whole project without the loss of geometry.”
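The specification does not prescribe an implementation, but the entity-level, audit-trailed commits Perry describes can be sketched in a few lines of Python. The DataFramework and Commit classes below, and the door and beam parameters, are invented purely for illustration and do not represent any real product or API.

```python
import datetime
import uuid
from dataclasses import dataclass, field

@dataclass
class Commit:
    """One change to a single entity, recorded for the audit trail."""
    author: str
    entity_id: str
    changes: dict
    timestamp: str = field(default_factory=lambda: datetime.datetime.utcnow().isoformat())
    commit_id: str = field(default_factory=lambda: uuid.uuid4().hex)

class DataFramework:
    """Toy centralised store: entities are addressed individually, never as whole files."""

    def __init__(self):
        self.entities = {}   # entity_id -> current parameters
        self.history = []    # append-only audit trail of commits

    def commit(self, author, entity_id, changes):
        entity = self.entities.setdefault(entity_id, {})
        entity.update(changes)                  # apply only the changed parameters
        record = Commit(author, entity_id, dict(changes))
        self.history.append(record)             # every change is traceable
        return record

    def query(self, entity_id):
        """Read one entity without opening the full model."""
        return self.entities.get(entity_id, {})

# Two collaborators commit to the same project, entity by entity.
framework = DataFramework()
framework.commit("architect", "door-D-101", {"width_mm": 926, "fire_rating": "FD30"})
framework.commit("engineer", "beam-B-07", {"section": "UB 406x178x54"})
print(framework.query("door-D-101"))
print(len(framework.history), "commits in the audit trail")
```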

As an industry, we are currently locked into proprietary file formats. In the next generation, Perry clearly makes a case for shared and distributed ownership without the vendor lock-ins that the industry has suffered from.


Context and scale were other aspects of data that Perry commented on – with the need for detail, such as an individual sprinkler, in relation to the whole building. Game engines like Unreal with Nanite technology can model dust particles and sand, while scaling up geometry to the size of countries. Models are being rebuilt and remade for this level of detail. We need modern, performant software from concept to construction levels of detail, he said.



Responsible design

Projects are increasingly becoming centred on retrofit. With sustainability at the core, the focus is on reinventing the buildings we already have. Time is spent analysing the fabric of existing buildings, but there are no tools available to help with this. Perry explained that we need software that understands construction and embodied carbon – every major design firm has written its own. We also need more tools for operational energy, water use, climate design, occupant health and community – we are not seeing any tools with these on their agenda, he said.

MMC and DfMA

Modern Methods of Construction (MMC), DfMA and off-site construction are increasing, and firms are looking for more certainty in construction. There are design constraints for shipping volumetric buildings to site, yet Perry explained that 95% of software currently available will not allow designers to create design constraints when carrying out conceptual massing studies.

Most of the massing study software – and many such tools have come to market – is not fit for purpose. In fact, design tools do not allow architects to go down to the fabrication level of detail required. There is no modular intelligence.

Modelling capabilities

Moving from drawing board to CAD was pretty straightforward. Moving to object-based modelling was more challenging. Perry calls for more flexible, smarter tools that have construction intelligence: "We have either flexible tools without structure, or structured tools with no flexibility. Neither of which understand how buildings come together."

Automation and intelligence

The industry is highly repetitive, yet across all firms, despite there being experience and knowledge in abundance, that knowledge gets applied from first principles each time, without learning anything from previous projects. Can we have tools that capture that knowledge?

The core size is manually calculated for every single building, over and over again, as the building envelope changes. When talking about repetitive tasks, like creating fire plans for multi-floored buildings, Perry asked: "Where's Clippy? We have no automation from past projects."

Deliverables

The industry is spending far longer creating drawings than designing. “Why are we still generating drawings on A1 and A3 sized templates but these are never printed by anyone?” asked Perry.

“The drawing production process is about the release and exchange and acceptance between multiple parties and it’s sticking around for the time being because it’s our sole method, as an industry, to transact between different parties but what is the lifespan during production?” Using a centralised data framework with automation might become an alternative. Drawing production might become more incidental in the future.

Access data and licensing

As subscription has changed the way customers pay for tools, there have been far too many experiments with business models. Floating licences have been replaced by individual named-user licences. Firms end up restricting access to those products. "Commercial limits are commercial limits," said Perry. Token-based systems expect the customer to gamble on future usage needs.

"We are seeing a trend of some vendors charging a percentage of construction value. The reality is that by doing that you are pushing the decision-making process of what design software to use to the person paying the bills and setting the total values for projects, so we're effectively walking into a situation where we're saying 'the commercial decision for purchasing software now sits with the pension investment fund that owns the budget'." Needless to say, Perry is not a fan.

In his hour-long presentation, Perry outlined just a few of the topics dealt with in the Future AEC Software Specification, which can be seen in full here.

To watch Perry’s excellent presentation and the panel discussion that followed, click here

Conclusion

We are at an interesting point in AEC software development. Mature customers are ahead of the software developers, wanting more, and demanding higher productivity benefits. The leading software developer, Autodesk, has just started replacing its tyre at 90mph, working out how to migrate Revit data to its new unified database, while figuring out what design tools Autodesk Forma will offer.

At the same time, new start-up firms are listening to mature customers while pondering how to address the mass market. The Future AEC Software Specification is a generic guide for all developers in the industry, describing the kinds of tools that large firms would like to see, beyond what is currently available.

From watching Perry’s crystal clear talk at NXT DEV, and dropping into the conversations on the various panel sessions, it’s clear that there is plenty of opportunity for the industry to work together to develop the kind of capabilities which many would love to have right now.

AEC Magazine hopes to put on a yearly event where we continue to directly connect innovative practices with software developers and VCs. It's reassuring to hear from the OLG that large construction firms have been inspired by Perry's talk and the release of the software specification, and are already looking to join the initiative and contribute to the specification to make it broader and applicable across disciplines.


Mission statement
Future AEC Software Specification

“Our aim is to set out an open-source specification for future design tools that facilitate good design and construction, by enabling creative practice and supporting the production of construction-ready data.

“This specification envisages an ecosystem of tools, for use by large firms and single practitioners alike, with a choice of modular applications each with specific functions, overlaid on a unifying “data framework” to enable efficient collaboration with design, construction and supply chain partners.

“As architects, engineers and contractors, we do not profess to have all the expertise needed to enable this ecosystem, but we do know what we need to do our work, and so we are looking to engage with the software community to deliver the best solution.”


A Q&A with Aaron Perry (AHMM) and Andy Watts (Grimshaw)

We caught up with Aaron Perry (AHMM) and Andy Watts (Grimshaw) after NXT DEV to find out what kind of response they had to the Future AEC Software Specification.


AEC Magazine: What was the immediate reaction at NXT DEV after you came off stage? What did people say to you?

Aaron Perry: When I came off stage, I couldn’t move two metres without another group of the audience wishing to speak with me. I was surprised how many people really ‘got’ the spec’s intentions. To some extent, I expected many to ask, ‘What about IFC?’ and most people got that we aren’t talking about file formats but web-enabled data exchanges.

Andy Watts: It was also surprising the range of interest we got – from engineers, developers, contractors, even the likes of KPMG. The way Aaron pitched the presentation made sure there was resonation with everyone, regardless of discipline.

Aaron Perry: Online, it’s been overwhelmingly positive, I expected trolls to express alternative opinions for the sake of it, but the feedback so far has been very positive and complimentary of the initiative. I’ve spoken with vendors and designers, and it’s been encouraging.

Andy Watts: After a few weeks to digest, we’ve started to see some more mature engagement as well, readers wanting to know how this will work.


AEC Magazine: You have HOK and BIG signed up. What's happening with the AIA LFRT and the Nordic Open Letter Group? Are they in the process of approving it?

Aaron Perry: The website has been updated with supporters, and there are many that we haven’t even managed to get around to adding. We received confirmation from the Architectural Associations of Finland, Norway, Iceland and Denmark on the morning of NXT DEV. Since this, we’ve also had Central European associations and firms express interest.

Andy Watts: I've met with HOK and Woods Bagot whilst in the US the past couple of weeks. There has been conversation happening informally amongst the US firms – a healthy debate was how that was framed to me. But since then, a number of US firms have also started reaching out – nbbj, Perkins & Will, etc. Greg [Schleusner] of HOK and Shane Burger of Woods Bagot are also laying the groundwork for a presentation to the LFRT by Aaron and me later in the year.

Beyond that, we've started to see Australia get on board in the last few weeks – Cox, Architectus and Aurecon have all started reaching out. And it has recently had some publicity at some Australian events.


AEC Magazine: Construction firms were also interested. Will there be signatories from the construction space?

Aaron Perry: One of the most surprising conversations I had straight after the presentation was from a Tier 1 Main Contractor who said something along the lines of ‘we’re very in bed with Autodesk. We’re actually working with them to build out custom tools for us, but you raise a really valid point that we should challenge vendors too, can we speak more…’ So yes, this is not limited to just designers.

ICE and others have asked us to report to their strategy groups to help them understand where they can contribute.


AEC Magazine: Now that you have delivered the specification what comes next?

Aaron Perry: We’re working on it. We have a solid plan of what steps to take, but I’d be talking out of turn to share that without having received a broader sign-off.

Andy Watts: Agreed. Generally our approach was to put this out in the public to see if it resonated, and then plan from there. We’ve got the resonation – now we’re working on the plan.


AEC Magazine: How can firms contribute to the specification?

Aaron Perry: The spec is live and has started to receive feedback/input. We’re pointing people to that and to add their thoughts and adaptations.

Andy Watts: Also, we’ve seen a lot of engagement through word of mouth, so we’d encourage people to share.


How we arrived at the Future AEC Software Specification

Over the past five years, the majority of investment in the AEC industry, including venture capital, acquisition and development, has focused on construction. This happened at a time when architects and users of the number one design modelling tool, Autodesk Revit, were getting frustrated at the lack of meaningful updates, but seeing increased software subscription costs.

Meanwhile, Autodesk was in full acquisition mode, buying up cloud-based applications to fill out its Construction Cloud service. While there had been long-term rumours of a new generation of solutions from Autodesk based on the cloud, originally called Quantum, there was no set date or anything to show.


In his work on data lakes, Greg Schleusner of HOK has highlighted the inefficiencies of data silos

This frustration led to the formulation and delivery of an open letter to the CEO of Autodesk, Andrew Anagnost, from 25 UK and Australian firms, highlighting a wide range of complaints, from lack of updates to the stuffing of ‘Collections’ with products they don’t ever install, let alone use.

The net result was a series of meetings with Autodesk management and product development staff, exploring what kinds of features these firms would like to see in Revit.

In 2022, Anagnost made it clear that there was not going to be a second generation of Revit as we know it – a desktop application. In an interview with Architosh, he said, “If you want a faster horse, you might not want to work with us because we will not make a faster horse. But we can be that partner and tool provider that supports professionals into the new era of architecture.”

That new era was a future version of Autodesk Forma, the cloud-based platform which soft-launched this year with an initial focus on conceptual design.

Around this time, the Open Letter Group (OLG) were engaged with the Revit development team, providing wish lists for Revit. Meanwhile, a group of signature architects had also approached Autodesk with similar complaints, just without a public letter, and went through the same engagement process, requesting new features.

The Revit team had its own part-published roadmap and there were attempts to see where they could align. But herein lies a problem: when asking for feature wish lists, the many will always drown out the few.

Autodesk has sold somewhere between 1.2 and 2 million seats of Revit and the majority of users have fairly rudimentary needs. Meanwhile, many of the firms that were complaining were the mature BIM users, a minority that are really pushing the boundaries and are frustrated at being held back by Revit's lack of development velocity.


This led to a second open letter, the Nordic Letter, which is now signed by 324 firms, including BIG, together with the Danish, Norwegian and Icelandic architectural associations, representing some of the firms that had approached Autodesk privately.

The OLG had little faith that Autodesk would build into Revit the advanced features its members need. Any boundary pushing technology or new development was more likely to appear as a cloud service or as part of Autodesk Forma – and this transition could take years.

The OLG decided to work on a specification, a shopping list of the kinds of technologies they would like to see in a next generation product.

By this time, late 2022, the OLG had been approached by nearly every large software company with an interest in either entering the BIM market or tailoring their current offerings to meet the BIM needs of the members – Hexagon, Nemetschek, Dassault Systèmes, Trimble, Graphisoft, BricsCAD, Qonic and others.

The decision was made to collate the specification and just ‘put it out there’ for the benefit of all / any developers looking to create next generation design tools. And there are many, including Snaptrude, Qonic, Arcol, Blue Ocean and Swapp, with more waiting in the wings.

As we move towards a second generation of design tools, one of the key challenges for these software developers is to what extent they should respond to the needs of the firms that have complained. Autodesk is not alone in feeling it needs to cater to the masses.

As one developer told us, “We’re in the volume market and developing for the signature architects and large firms does not make commercial sense for us.”

The problem is, artificial intelligence and machine learning are coming very quickly to this market, and people who use BIM to make simple rectangles and unexciting geometry are going to be the first to face automation.

In an AI world, those who create complex geometry and more unique designs will be the ones that will still require more software products. AI probably means more Rhino.

Databases over files

Greg Schleusner of HOK has been on a similar tack, looking at next generation tools, specifically concentrating on the change from proprietary to open data formats and from files to data lakes.

The industry wants data sovereignty and independence. It does not want to be trapped in proprietary file formats again. We've already been DWGd and RVTd; we need to move beyond the silos and it's time for open databases, offering the freedom to share at a granular level, without the need to write 'through' a vendor's application to gain access to your own or others' project data held in proprietary repositories. Schleusner has managed to bring together a consortium of 39 firms to pursue the creation of this open data lake.

To really understand the scope of this, we recommend watching his past three presentations from NXT BLD.

2021 – Creating Balance

2022 – Road to Nowhere

2023 – Assemble

AEC July / August 2023 Edition out now

What leading AEC firms want from next generation tools, NXT BLD / DEV on-demand, plus lots more


We kick off summer with an important edition of AEC Magazine, where we hear from leading AEC firms about what they want from next generation tools. It’s available to view now, free, along with all our back issues.

Subscribe to the digital edition free + all the latest AEC technology news in your inbox, or take out a print subscription for $49 per year (free to UK AEC professionals).


What’s inside the July / August edition?

The future AEC software specification
At NXT DEV, Aaron Perry of AHMM, backed by many global firms, presented a wish list of what leading edge BIM practices want from next generation tools.

NXT BLD / NXT DEV on-demand
NXT BLD was followed this year by NXT DEV, a conference dedicated to the future of AEC software. All 40 presentations from both events can now be viewed on-demand.

Will AI design your next building? 
Will AI take architects’ jobs too, or will it make them more fulfilling instead? asks Akos Pfemeter, VP, technology partnerships at Graphisoft.

Revizto reflections 
We caught up with Revizto’s CEO to discuss the company’s origins, development path and latest release.

Autodesk Tandem: dashboards
Autodesk has added new dashboarding capabilities to Tandem for obtaining all sorts of metrics from digital twins.

Review: Nvidia RTX 4000 SFF Ada Generation 
Nvidia’s new compact workstation GPU delivers in all types of AEC workflows, but there’s a premium to pay.

Review: AMD Radeon Pro W7900 & W7800
AMD’s new workstation GPUs can’t outpace the Nvidia RTX 6000 Ada, but they can compete on price / performance.

AMD Ryzen 7000 X3D for CAD, viz and simulation
AMD’s new Ryzen CPUs with 3D V-Cache deliver a performance boost in 3D games, but what do they offer AEC professionals?

Revizto reflections

Martyn Day caught up with Revizto CEO, Arman Gukasyan, to discuss the company's origins, development path and latest release

While attending Revizto’s London Field Day earlier this year, Martyn Day caught up with CEO, Arman Gukasyan, to discuss the company’s origins, development path and latest release

In the world of digital coordination and project information dissemination, Revizto has been pushing the boundaries since 2011. Headquartered in Lausanne, Switzerland, the company says it has over 150,000 AEC users in 150 countries. After 12 years of development, it's still a private company and has a user base that's growing rapidly, including firms such as AECOM, BAM, Atkins, Grimshaw, BDP and Balfour Beatty.

The software has evolved from basic viewing and filtering to being an essential tech stack element. The Revizto cloud-based hub provides a highly performant single source of truth for 2D and 3D project data, together with issue tracking, VR, clash detection and now with a full-power iPhone and tablet client.


Martyn Day: Revizto has appeared in AEC Magazine for some time, but I don’t think we have ever covered the origins of the company and where its technology came from. How did Revizto start?

Arman Gukasyan: It all started following a stint working for Infomap. Part of my role was developing the business, and in meetings and discussions with C-level executives about city planning, I found out that not only were they getting different sets of data from different disciplines, but each city and county/region also had its own set of standards and guidelines to follow!

With buildings, infrastructure and cities becoming more and more complex, the data they needed was insufficient for their needs. At the time it was all about CAD data as BIM was only just being talked about.

I started to experiment to see which technologies could handle the heavy 3D data without distorting it, and create a lightweight, interactive version, which could be used for communication and collaboration.

It was pretty clear to me that gaming technologies were the way to go, as a number of games included expansive maps and models of cities.

With angel investment I started the business in 2008 and hired the first employee who was a game developer. We wanted to disrupt the AEC industry with technology to support project coordination, collaboration and project communication; and be uncomplicated and scalable.

Our experiments started with 2D AutoCAD data and us creating 3ds Max models. At first, we provided a service for small to medium-sized construction projects, delivering an EXE file which created an interactive way to explore designs. This created a huge clash between owners and architects, as the architects didn’t recognise their designs, as they had never seen all their data imported into one place before! It gave them a whole new context on the project.


Martyn Day: Autodesk bought UK developer Navisworks in 2007. To some extent this was their BIM viewer for the masses. It sounds like rather than delivering an application you were more of a service at the time?

Arman Gukasyan: We delivered two main projects as a service business before deciding that this wasn’t where I wanted the company to be. [The first project was] creating an exact replica of the Olympic Village, both buildings and infrastructure for the Olympic Committee.

They used this model to train 4,500 volunteers six months before the Olympics took place, which hadn't been done before. One of the Olympic sponsors, Coca-Cola, used the model to help decide the placement of its billboards and assess viewing points.

Revizto CEO, Arman Gukasyan

The second project was a model for La Sagrera in Barcelona, a station on the high-speed rail line connecting Paris and Barcelona. Our model and simulations helped to identify that the platforms would be too narrow to manage the rush-hour capacity, so changes were made to address this issue.

We developed Software as a Service in 2011, when Revit and SketchUp were the main authoring tools. We developed plugins so that data could be 'sucked out' of Revit, taken into Revizto and optimised, allowing users to interactively explore their designs in a basic way.

Seeing that this was not enough, we then created an issue tracking component to sit on top of it, based on Jira from Atlassian and adapted for the AEC industry.

I travelled between Switzerland and the US a fair bit in the early days, so this is where I first recruited a dedicated sales representative. Then we came to the UK, EMEA followed by APAC.

We launched Revizto at Autodesk [University] AU in 2012. We were a very different offering from what was on the market, Revizto’s focus is, and always will be, on ease of use.

After the initial successful launch, we started to develop the issue tracker more, bringing 2D and 3D together, because, even today, 2D is still a big part of the process, when it comes to contractual documents.

In 2015 we developed an automatic overlay with a 3D [model], which proved very popular and, in turn, meant we were trusted by larger organisations and working on bigger projects.

Although it seemed like the main authoring tool being used by the industry for BIM was Revit, we realised that we had to remain platform agnostic and also support others like Bentley, Nemetschek and Trimble, so we developed integrations for all the platforms. Every market vertical (civil, rail, oil and gas, mining, architecture etc.) has different tools that they use. You can bring your data into Revizto no matter how you create it!

In 2017/2018 we added support for point clouds, as firms increasingly started to check the reality against the BIM data. We were the first to work out how to get point clouds on phones and tablets. Our customers can load huge point clouds without any challenges or streaming; the data is cached locally, automatically. When you are using a streaming-based solution, you are very dependent on your bandwidth and where and when you can open a project. In Revizto, once you open a project, it's cached on your local machine or mobile device (phones and tablets).



Martyn Day: In all Common Data Environments (CDEs) and model viewers, clash detection is always high on the end user wish list. Last year you released a major update with a very good, mature clash detection capability. This caused some issues with Autodesk who refused you a stand at Autodesk University. With Spacemaker and Navisworks, you were suddenly deemed perhaps too competitive?

Arman Gukasyan: We spent several years (2018 – 2021) developing clash detection whilst watching the market to see if solutions such as Navisworks or Solibri were going to be developed further in this direction. We saw no evidence of this and as our customers were asking for a clash tool, we released the capability within Revizto in 2021 to create an integrated collaboration platform.


Martyn Day: Autodesk really confused the Navisworks product, moving some functionality to the cloud and leaving some on the desktop, which meant your data needed to be in two places, depending on what functions you wanted to do. I’ve come across users who had been told that Navisworks was end of life.

Arman Gukasyan: Our philosophy here at Revizto is to listen to what our customers would like, really listen. This is done in a number of ways, with the main one being through our implementation team who have all come from industry. This team encourages our customers to show us their challenges and we aim to develop a solution.

Our clash solution came about from these conversations – a truly collaborative process. Revizto cloud architecture allows everyone involved in the project (no matter where they are based) to view which clash tests are being worked on, however no one else can touch these clash issues until the person working on them finishes. Once finished everyone can view them in real time.

Clash detection is not about creating millions of clashes and stuffing them into an issue tracker. Clash detection is about how I can create a clash-free model, and then feed only the real clashes into the issue tracker. With other solutions, if you open up your mailbox and you have 1,000 or 2,000 unread messages, you will never get to the bottom. With Revizto everything happens in real time.


Martyn Day: While clash detection is highly asked for and highly valuable in protecting against millions of dollars of errors on site, the number of people that do clash detection seems to be a loud minority?

Arman Gukasyan: That may be so, but clash detection is part of the process in managing data. Revizto is an integrated collaborative platform where clash detection is a main part of the platform. There may be a loud minority, but it is a powerful minority.


Martyn Day: Moving on to the iPhone version, how much data can you hold on it? Can it be used on an iPad? Is there a memory limitation on your app or the generation of iPhone that you can open?

Arman Gukasyan: It depends on what project you're opening, of course, which starts with the size of the model. We don't have a specific limitation. We did a lot of testing on both the iPhone and Android, and we can push the limits all the time with the iPhone.

Android is less powerful due to the operating system taking a lot of the available RAM. Our brand-new mobile app isn't streaming the data; it's loading highly optimised data and can open models that other solutions and desktop apps can't.

The most important thing the mobile app does is that it only renders whatever you see (occlusion culling). When the mobile app first opens it asks you what data you want – only the cached data, old updates, or if you want to see a particular pipe. It can load files which contain hundreds of models, thousands of sheets, even point clouds.


Martyn Day: There are an increasing number of iPads / tablets on building sites, why the bias to phones?

Arman Gukasyan: Not everyone has a tablet, but nearly everyone has a phone. Again, we listened to our customers who wanted to be more productive by using Revizto on their phone, anytime anywhere.


Martyn Day: So what capabilities are missing from it?

Arman Gukasyan: You wouldn’t actually do clash detection on the phone. You would use the phone app to see all the clashes that have been highlighted in the issue tracker.


Martyn Day: With clash detection and now the mobile app rewritten, which areas are you looking to develop next?

Arman Gukasyan: We currently cover the design and build part of the process and have started to focus on the handover and FM (Facility management) stage. Other areas of interest include Digital Twins and Augmented Reality.

A final point from me is to reiterate that we always watch closely the problems our users have and bring a solution with our own twist. I believe that we have made it easy to access your data and invoke collaboration between all project team members, and making that collaboration seamless across the project lifecycle will be the main focus for us.

nima virtual conference to focus on data

Conference aims to provide a gateway to better information management within the construction sector

Free nima virtual conference aims to provide a gateway to better information management within the construction sector

Data, and how to unlock its true value in the construction industry, will be the key theme for nima's inaugural virtual conference on 2 November 2023.

The event will include a day of online learning with live keynote talks and panel discussions on topics shaping the industry with a focus on data, advancing technologies and sustainability.

“The nima Virtual Conference will bring together inspirational strategists and real-world practitioners, to help turn theory into practice,” said nima chair, Anne Kemp. “The value lies in how data is used rather than simply how much data is available.”

The conference will be held in an immersive environment, where the audience, speakers and exhibitors interact through a virtual platform.

There will be four keynote speakers and four practical learning sessions, all focused on unlocking the value of data and how to use data to meet today’s urgent information management needs.

Registration for the free nima virtual conference is now open.

nima was previously known as the UK BIM Alliance, an independent, not-for-profit organisation. The name was changed in September 2022 to reflect the shift in focus of UK government and industry discussion about digital transformation in the built environment.

Defining BIM 2.0

What happens when disruptive technology arrives?

Over decades, the BIM software industry has refined the process of continual development and evolution of the various competing tools. With proprietary file formats, subscription, third-party ecosystems and industry standards, what happens when disruptive technology arrives?

66 million years ago, the Earth was hit by a 10 km-wide asteroid, the Chicxulub impactor, which struck the Yucatán Peninsula. Humans were blissfully unaware of what came before us until 1842, when British scientist Richard Owen coined the term 'dinosaur'. Likewise, I often get the feeling that many in our industry don't know what came before today's BIM systems, other than 2D drawings and AutoCAD.

In the 1980s, there were previous generations of BIM tools, before the definition and common usage of building information modelling. Considering the compute power available at the time, it really was incredible what Sonata, RUCAPS and GDS were able to do in 2.5D and 3D. However, the multiple asteroids of cost, desktop computing and the ubiquity of cheap 2D CAD quickly turned them into fossils, to be mainly forgotten.

While the creators of some of the previous crop of AEC tools were trying to adapt and survive in the new desktop world, Dr. Jonathan Ingram, creator of the UNIX-based Sonata (at over £100k per seat), was developing a desktop version, called Reflex, but sold it to PTC before widespread commercialisation. This early BIM technology inspired Graphisoft in Hungary and was eventually spun out of PTC, serving as the root of Revit.

History repeats. The software industry goes through cycles where success can come from evolution and widespread adoption, only for something disruptive to come along and change the game. In the 1980s mechanical design software world, there were many Unix-based design tools, such as CADDS and MEDUSA from Computervision, which had to develop both the hardware and the software. Then along came a software company called PTC, which developed a parametric modelling tool on the relatively low-cost Sun workstation and eventually put the original pioneers out of business.

In time, Solidworks came along on the Windows PC and almost did to PTC what PTC had done to Computervision. These cycles happen rarely, perhaps every two or three decades, as dominant players stagnate, and new blood enters the market.



The cloud

Looking at the crop of new software that AEC Magazine has covered over the past eighteen months, it’s clear to me that something similar is happening, with a number of technologies and ideas solidifying around fresh approaches.

In past conversations with CEOs of design software, many have identified the shift from desktop applications to cloud applications as being the most likely asteroid to shake up the market – and, based on that, Autodesk is aiming to disrupt itself with Forma, before any other player does.

From my point of view, one of the key reasons why Revit was not rewritten ten years ago was because Autodesk believed that it was moving to the cloud and, therefore, until the base cloud platform had been built, there was no point in rewriting a successful product. But the process of developing that technology took longer than expected.

The delay has opened a chink in the company's armour, with a handful of cloud-based startups hoping to deliver 'a faster horse' and a more collaborative experience. Get used to hearing about tools that want to be the Figma of BIM. (N.B. Figma is a collaborative web application for interface design). We currently have Arcol and Snaptrude vying for this position, with others in the wings.


Snaptrude includes a bi-directional link with Revit to enable users to ‘seamlessly transfer’ models between the tools

However, the cloud, as a pure market disruptor, doesn't necessarily seem to be enough to change customers' workflows today; the increasing power of the desktop and the need for powerful GPUs are still seen as key resources for today's AEC designers. Very few cloud-based applications utilise cloud-based GPUs as they are expensive, and the typical virtual machines used in AWS and other public clouds are not workstation class, so you still need a decent local computer. And, as shown in the in-depth article in our Workstation Special report, cloud workstations play second fiddle to personal workstations when it comes to performance.

There is also a chance that customers might not want to use a pure-play cloud BIM application. Graphisoft, the developer of Archicad, has gone for a hybrid approach to the cloud, letting customers choose to keep their project data local or on its BIMcloud Software-as-a-Service (SaaS) platform. The company has even gone so far as to enable analysis applications to run locally or in the cloud.

Graphisoft has taken this approach as it has concerns over browser security, and in response to customers' appreciation of local hardware and being in control. If Autodesk, with Forma, gets this complex migration to the cloud wrong, Graphisoft will have a USP with its cloud-oblivious BIM architecture.

Artificial Intelligence

Cloud is not the only disruptor. While there is a lot of news about artificial intelligence and machine learning, and everybody's excited about AI image generator Midjourney and the ability of ChatGPT to write your CV or make some in-depth recommendations about the best Korean dramas on Netflix, the real disruption has only just started.

Augmenta and Swapp have demonstrated the kinds of capabilities that AI can bring to AEC. Augmenta automatically routes systems like electrical, plumbing and MEP in a Revit model, while Swapp will take a sketch of defined spaces and build a detailed 3D model together with all drawings in the time it takes to have lunch. This is not just taking hours off project time but weeks and months.




I would say we’re only years away from having fully automated and checked 2D drawing output. This single phase of a project – the creation of documentation – costs the AEC industry hundreds of millions of dollars per year. The first software company to deliver a reliable automated workflow will make an absolute killing.

The automation of design modelling and documentation will certainly be a sizeable asteroid hitting the market. While this might impact jobs, and take away the drudgery of document preparation, it will even hit software firms whose sales of millions of subscriptions of drawing tools may reduce to a trickle. And because software firms don’t like to go backwards in revenue generation, expect to see new business models emerge to make up for this lack of seat sales. I predict we will see more companies charging by the value of the project, or as a percentage of fees charged.

Data

The AEC industry has long been trapped in silos, unaided by the proprietary file formats of the vendors. As the industry moves to a more cloud-centric approach, files will wane, and project data will be stored in ‘data lakes’. This will make collaboration much easier. Instead of shunting huge files through document management systems, subsets of models and systems of components will be able to be shared dynamically.
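As a deliberately simplistic illustration of that shift, the sketch below uses an in-memory SQLite table as a stand-in for a project 'data lake': a sub-contractor queries only the components it needs rather than receiving a whole model file. The table layout, component IDs and parameters are invented for this example and do not reflect any particular vendor's schema.

```python
import sqlite3

# In-memory stand-in for a project 'data lake': one row per building component.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE components (
    id TEXT, discipline TEXT, level INTEGER, category TEXT, params TEXT)""")
db.executemany("INSERT INTO components VALUES (?, ?, ?, ?, ?)", [
    ("SPK-0001", "fire", 3, "sprinkler", '{"coverage_m2": 12}'),
    ("DUCT-0042", "mechanical", 3, "duct", '{"diameter_mm": 400}'),
    ("COL-0007", "structural", 1, "column", '{"section": "UC 254x254x73"}'),
])

# A fire-protection sub-contractor pulls only the sprinklers on level 3 --
# no file exchange, and no need to open (or even hold) the whole model.
rows = db.execute(
    "SELECT id, params FROM components WHERE category = ? AND level = ?",
    ("sprinkler", 3)).fetchall()
print(rows)
```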

Customers and some vendors have made big strides in talking about and utilising open data formats, and even big players like Autodesk say they are committed to the liberalisation of project data through open APIs. There is, of course, no reason why a project data lake needs to be stored on a vendor's public cloud. Truly customer-focused vendors will offer capabilities both in public clouds and behind your firewall, and will submit their data lake format to open source.

Fabrication

We are facing a demographic time bomb at a time when we have a huge demand for new builds. The construction industry has been trying and mainly failing at setting up offsite construction facilities, to mass manufacture volumetric buildings and ship them to site.


None of the current generation BIM tools were ever intended to interface to or drive digital fabrication. Then there’s the small problem that architects have never been trained to design for manufacture. While the new generation of BIM tools are embryonic, the developers are all aware that they must cross the divide between architecture and construction.

Other tech

With looming sustainability targets, designers and construction firms will be forced to truly understand how sustainable their designs are and clients will not be able to dismiss sustainable elements to save on costs. The ubiquity of sensors will also eventually mean that data from the real world can be combined with the BIM model, further driving better understanding of performance, with real-world metrics. Next-generation tools need to be fluent in aiding the design of sustainable buildings and infrastructure.

Conclusion

The acceleration of design technology development is clearly happening. With the introduction of new competitors, new foundation technologies and market-leading firms such as Autodesk aiming to disrupt themselves by moving to a cloud-centric solution, we are in new territory.

Historically, as we have moved through different ages of design software, market leaders and their proprietary formats have managed to lock customers in and damage open interoperability. We got 'DWGd' and 'RVTd'. With this next generation of tools and a verbal commitment to open data, BIM 2.0 looks like it's starting off on the right foot this time.

The most disruptive and intriguing part of this technology mix will be what AI does to the industry. Should an early adopter be able to remove the time and cost of just producing drawings, that advantage could be used in fee reduction, or taking on more jobs with a smaller team.

The competitive disadvantage to other firms staying with the current generation of manual tools would be incredibly significant. Therefore, I conclude that technologies which enable this, once proven, will spread like wildfire through the world’s design firms.

BIM 2.0 and what the industry wants from next gen AEC software were key topics up for discussion at our NXT BLD and NXT DEV conferences, held at the Queen Elizabeth II Centre in London on 20-21 June 2023.

All of the talks are available to watch free on demand.

NXT BLD 2023 on-demand – register here

NXT DEV 2023 on-demand – register here

At NXT BLD, Greg Schleusner, Director of Design Technology Innovation at HOK, talked about data lakes and taking back control of BIM data. The day also featured Adi Shavit, the CEO of SWAPP, a company that aims to automate much of the design and drawing phases in BIM. Finch3D also talked about AI in architectural design, and Senthil Kumar of Slate Technologies gave us his impression of AI's impact in construction.

At NXT DEV (21 June) we sought to define what BIM 2.0 is through a series of panel sessions / town hall debates. There were plenty of discussions, including an insightful presentation from Aaron Perry, AHMM, about exactly how the software technology landscape needs to change to better serve the process of designing and constructing the built environment.

The audience combined practising AEC firms with technologists, developers and venture capitalists. Leading AEC practitioners informed those creating the next generation of tools where the pain points and holes in tech stacks are – and what the industry needs next.

NXT DEV also presented an opportunity to meet the founders and see the coolest application developers we have covered over the last year in AEC Magazine — Qonic, Hypar, Augmenta, Blue Ocean AEC, SWAPP, Spaces, Finch3D, Arcol, Snaptrude and others.


The BIM 2.0 wish list

What we think should be in next generation AEC software tools

  • Integration with other technologies such as Artificial Intelligence (AI), machine learning (ML), and the Internet of Things (IoT) to enable more sophisticated analysis, optimisation, and automation of building design, documentation and operation.
  • Enhanced collaboration and data sharing capabilities, including improved interoperability between different software applications and open data standards.
  • Greater emphasis on sustainability, resilience, and lifecycle analysis, allowing designers and builders to create more environmentally friendly and resilient buildings.
  • Advanced visualisation and simulation capabilities, including augmented reality and virtual reality technologies that enable more immersive and interactive design experiences, bringing BIM to the world of the metaverse.
  • Integration with blockchain or other distributed ledger / encryption technologies to enable more secure and transparent data sharing and collaboration.
  • Integrating Desktop and Cloud to seamlessly connect project participants and provide new SaaS business models.

Autodesk Forma: a deep dive into the data lake

Autodesk's AEC cloud platform has an initial focus on conceptual design, but it will become much more than this. We explore how it handles data

Autodesk boldly told the industry that there would be no second-generation Revit. Instead, it would develop a cloud-based platform for all project data, called Forma. This month the first instalment arrived, with a focus on conceptual design. But Forma will become so much more than this. Martyn Day gets to the heart of the platform and how it handles data

In early May, Autodesk announced the first instalment of its next-generation cloud-based platform, Forma, with an initial focus on conceptual design (see box out at end). The branding and original announcement of Forma was delivered at Autodesk University 2022 in New Orleans. While the name was new, the concept was not.

Back in 2016, the then vice president of product development at Autodesk, Amar Hanspal, announced Project Quantum, the development of a cloud-based replacement for Autodesk's desktop AEC tools.

In the seven years that followed, progress has been slow, but development has continued, going through several project names, including Plasma, and several project teams. While the initial instalment of Forma could be considered to be just a conceptual design tool — essentially a reworking and merging of the acquired Spacemaker toolset with Autodesk FormIt — this launch should not be seen as anything other than one of the most significant milestones in the company's AEC software history.

Since the early 2000s, Autodesk (and, I guess, a lot of the software development world) has held that the cloud, with a Software-as-a-Service (SaaS) business model, was going to be the future of computing. Desktop-based applications require local computing power, create files and caches, and generate duplicates. All of this requires management and dooms collaboration to be highly linear.


The benefits of having customers’ data and the applications sat on a single server include seamless data sharing and access to huge amounts of compute power. Flimsy files are replaced by extensible and robust databases, allowing simultaneous delivery of data amongst project teams.

With the potential for customers to reduce capital expenditure on hardware, and easy management of software deployment and support, it's a utopia based on modern computer science and is eminently feasible. However, to move an entire install base of millions from a desktop infrastructure, with trusted brands, to one that is cloud-based does not come without its risks. This would be the software equivalent of changing a tyre at 90 miles an hour.

The arrival of Forma, as a product, means that Autodesk has officially started that process and, over time, both new functionality and existing desktop capabilities will be added to the platform. Overall, this could take five to ten years to complete.

The gap

Autodesk’s experiments with developing cloud-based design applications have mainly been in the field of mechanical CAD, with Autodesk Fusion in 2009. While the company has invested heavily in cloud-based document management, with services such as Autodesk Construction Cloud, this doesn’t compare to the complexity of creating an actual geometry modelling solution. To create Fusion, Autodesk spent a lot of time and money to come up with a competitive product to Dassault Systèmes’ Solidworks, which is desktop-based. The thinking was that by getting ahead on the inevitable platform change to cloud, Autodesk would have a contender to capture the market. This has happened many times before, with UNIX to DOS and DOS to Windows.


Despite these efforts, Fusion has failed to further Autodesk's penetration of its competitors' install base. At the same time, the founder of Solidworks, Jon Hirschtick, also developed a cloud-based competitor, called Onshape, which was ultimately sold to PTC. While this proved that the industry still thought that cloud would, at some point, be a major platform change, it was clear that customers were neither ready for a cloud-based future nor willing to leave the current market-leading application.

Years later and all Solidworks’ competitors are still sat there waiting for the dam to burst. Stickiness, loyalty and long-honed skills could mean they will be waiting a long time.

This reluctance to move is even more likely to be found in the more fragmented and workflow-constrained AEC sector. On one hand, the pressure to develop and deliver this over ten years would seem quite acceptable. The problem is that Autodesk's desktop products have had a historically low development velocity and a growing, vocal and frustrated user base.

Back in 2012, I remember having conversations with C-level Autodeskers, comparing the cloud development in Autodesk's manufacturing division and wondering when the same technologies would be available for a 'next-generation' Revit. With this inherent vision that the cloud would be the next 'platform', new generations of desktop tools like Revit seemed like a waste of resources. Furthermore, Autodesk was hardly under any pressure from competitors to go down this route.

However, I suspect that not many developers at the time would have conceived that it would take so long for Autodesk to create the underlying technologies for a cloud-based design tool. The idea that Revit, in its current desktop-based state, could survive another five to ten years before being completely rewritten for the cloud is, to me, inconceivable.

Angry customers have made their voices clearly heard (e.g. the Autodesk Open Letter and the Nordic Open Letter Group). So, while Forma is being prepared, the Revit development team will need to mount a serious rear-guard development action to keep customers happy with their continued investment in Autodesk's aging BIM tool. And money spent on shoring up the old code and adding new capabilities is money that's not being spent on the next generation.

This isn’t just a case of providing enhancements for subscription money paid. For the first time in many decades, competitive software companies have started developing cloud-based BIM tools to go head-to-head with Revit (see Arcol, Snaptrude), Qonic and many others that have been covered in AEC Magazine over the past 12 months).

Autodesk has finally created the platform change event that it hoped for, without user demand, but it has come at a time when Revit is going to be challenged like never before.

The bridgehead

While Forma may sound like a distant destination, and the initial offering may seem inconsequential in today’s workflows, it is the bridgehead on the distant shore. The quickest way to build a bridge is to work from both sides towards the middle and that seems to be exactly what Autodesk is planning to do.

Forma is based on a unified database, which is capable of storing all the data from an AEC project in a 'data lake'. All your favourite Autodesk file formats — DWG, RVT, DWF, DXF etc. — get translated to be held in this new database (schema), along with those from third parties, such as Rhino.

The software architecting of this new extensible unified database forms the backbone to Autodesk’s future cloud offering and therefore took a considerable amount of time to define.

Currently, Autodesk's desktop applications don't use this format, so on-the-fly translation is necessary. However, development teams are working to seamlessly hook up the desktop applications to Forma. With respect to Revit, it's not unthinkable that, over time, the database of the desktop application will be removed and replaced with a direct feed to Forma, with Revit becoming a very 'thick client'. Eventually, the functionality of Revit will be absorbed into thin-client applets, based on job role, which will mainly be delivered through browser-based interfaces. I fully expect Revit will get re-wired to smooth continuity before it eventually gets replaced.

Next-generation database

One of the most significant changes that the introduction of Forma brings is invisible. The new unified database, which will underpin all federated data, lies at the heart of Autodesk's cloud efforts. Moving away from a world run by files to a single unified database provides a wide array of benefits, not only for collaboration but also for individual users. To understand the structure and capabilities of Forma's new database, I spoke with Shelly Mujtaba, Autodesk's VP of product data.

Data granularity is one of the key pillars of Autodesk's data strategy, as Mujtaba explained, "We have got to get out of files. We can now get more 'element level', granular data, accessible through cloud APIs. This is in production; it's actually in the hands of customers in the manufacturing space.

"If you go to the APS portal (Autodesk Platform Services, formerly Forge), you'll see the Fusion Data API. That is the first manifestation of this granular data. We can get component-level data at a granular level, in real time, as it's getting edited in the product. We have built similar capabilities across all industries.

“The AEC data model is something that we are testing now with about fifteen private beta customers. So, it is well underway — Revit data in the cloud — but it’s going to be a journey, it is going to take us a while, as we get more and richer data.”

To build this data layer, Mujtaba explained that Autodesk is working methodically around the workflows which customers use, as opposed to 'boiling the whole ocean'. The first workflow to be addressed was conceptual design, based on Spacemaker. To do this, Autodesk worked with customers to identify the data needs and learn the data loops that came with iterative design. This was also one of the reasons that Rhino and TestFit are among the first applications that Autodesk is focussing on via plug-ins.

Interoperability is another key pillar, as Mujtaba explained, “So making sure that data can move seamlessly across product boundaries, organisational boundaries, and you’ll see an example of that also with the data sheet.”

At this stage, I brought up Project Plasma, which followed on from Project Quantum. Mujtaba connected the dots, “Data exchange is essentially the graduation of Project Plasma. I also led Project Plasma, so have some history with this,” he explained. “When you see all these connectors coming out, that is exactly what we are trying to do, to allow movement of data outside of files. And it’s already enabling a tonne of customers to do things they were not able to do before, like exchange data between Inventor and Revit at a granular level, even Inventor and Rhino, even Rhino and Microsoft Power Automate. These are not [all] Autodesk products, but [it’s possible] because there’s a hub and spoke model for data exchange.

“Now, Power Automate can listen in on changes in Rhino and react to it and you can generate dashboards. Most of the connectors are bi-directional. Looking at the Rhino connector you can send data to Revit, and you can get data back from Revit. Inventor is now the same (it used to be one directional, where it was Revit to Inventor only) so you can now take Inventor data and push it into Revit.”


Find this article plus many more in the May / June 2023 Edition of AEC Magazine
👉 Subscribe FREE here 👈

Offline first

In the world of databases there have been huge strides to increase performance, even with huge datasets. One only has to look to the world of games and technologies like Unreal Engine.

One of the terms we are likely to hear a lot more in the future is ECS (Entity Component Systems). This is used to describe the granular level of a database's structure, where data is defined by component, in a system, as opposed to just being a 'blob in a hierarchy data table'.
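To make the ECS idea a little more concrete, here is a minimal sketch in Python. To be clear, this is not Autodesk's implementation (the entity, component and system names are purely illustrative), but it shows the basic pattern: an entity is just an identifier, its data lives in small components, and 'systems' operate over whichever entities carry the relevant components.

```python
from dataclasses import dataclass

# An entity is nothing more than an identifier; all of its data lives in components.
_next_id = 0
def new_entity() -> int:
    global _next_id
    _next_id += 1
    return _next_id

# Components are small pieces of plain data, held in per-component stores.
@dataclass
class Geometry:
    mesh_ref: str            # reference to geometry stored elsewhere

@dataclass
class Thermal:
    u_value: float           # W/m²K

geometry: dict[int, Geometry] = {}
thermal: dict[int, Thermal] = {}

wall = new_entity()
geometry[wall] = Geometry(mesh_ref="wall_042.mesh")
thermal[wall] = Thermal(u_value=0.18)

# A 'system' iterates over every entity that carries the components it needs,
# regardless of which application or party contributed that data.
def total_heat_loss(areas_m2: dict[int, float]) -> float:
    return sum(thermal[e].u_value * areas_m2.get(e, 0.0) for e in thermal)

print(total_heat_loss({wall: 12.5}))   # 2.25
```

The appeal for a federated AEC database is that new component types can be bolted on without touching the entities, or the systems, that already exist.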

I asked Mujtaba if this kind of games technology was being used in Forma. He replied, “ECS is definitely one of the foundational concepts we are using in this space. That’s how we construct these models. It allows us extensibility; it allows us flexibility and loose coupling between different systems. But it also allows us to federate things. This means we could have data coming from different parties and be able to aggregate and composite through the ECS system.

“But there’s many other things that we have to also consider. For example, one of the most important paradigms for Autodesk Fusion has been offline first — making sure that while you are disconnected from the network, you can still work successfully. We’re using a pattern called command query responsibility separation. Essentially, what we’re doing is we’re writing to the local database and then synchronising with the cloud in real time.”

This addresses one of my key concerns: that, as Revit gets absorbed into Forma over the next few years, users would have to be constantly online to do their work. It's really important that team members can go offline and still be able to work.
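The pattern Mujtaba describes, writing locally first and then synchronising the changes with the cloud asynchronously, can be sketched in a few lines of Python. This is a simplified illustration of the general approach, not Autodesk's code; the store and the 'push to cloud' call are invented placeholders.

```python
import json
import queue
import threading
import time

class OfflineFirstStore:
    """Writes land in a local store immediately; deltas sync in the background."""

    def __init__(self):
        self.local = {}              # element_id -> latest known state
        self.outbox = queue.Queue()  # deltas awaiting upload
        threading.Thread(target=self._sync_loop, daemon=True).start()

    def write(self, element_id, properties):
        # The user is never blocked: the local store is updated straight away.
        self.local[element_id] = {**self.local.get(element_id, {}), **properties}
        # Only the change (the 'delta') is queued, not an entire file.
        self.outbox.put({"id": element_id, "delta": properties})

    def _sync_loop(self):
        while True:
            delta = self.outbox.get()
            try:
                self._push_to_cloud(delta)
            except ConnectionError:
                self.outbox.put(delta)   # still offline: keep it for later
                time.sleep(5)

    def _push_to_cloud(self, delta):
        # Placeholder for the real synchronisation call.
        print("syncing", json.dumps(delta))

store = OfflineFirstStore()
store.write("wall-042", {"fire_rating": "EI60"})   # usable immediately, online or not
time.sleep(0.2)                                    # give the demo a moment to sync
```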

Mujtaba reassuringly added, “We are designing everything as offline first, while, at some point of time, more and more of the services will be online. It really depends on what you’re trying to do. If you’re trying to do some basic design, then you should be able to just do it offline. But, over time, there might be services such as simulation, or analytics, which require an online component.

“We are making sure that our primary workflows for our customers are supported in an offline mode. And we are spending a lot of time trying to make sure this happens seamlessly. Now, that is not to say it’s still file-based; it’s not file based. Every element that you change gets written into a local data store and then that local data source synchronises with the cloud-based data model in an asynchronous manner.

“So, as you work, it’s not blocking you. You continue to work as if you’re working on a local system. It also makes it a lot simpler and faster, because you’re not sending entire files back and forth, you’re just sending ‘deltas’ back and forth. What’s happening on the server side is all that data is now made visible through a GraphQL API layer.

“GraphQL is a great way to federate different types of data sources, because we have different database technologies for different industries, and they almost always have different workflows. Manufacturing data is really relationship heavy, while AEC data has a lot of key value pairs (associated values and a group of key identifiers). So, while the underlying databases might be different, on top of that we have a GraphQL federation layer, which means that, from a customer perspective, it all looks like the same data store. You get a consistent interface.”
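From a client's point of view, consuming such a federation layer typically means posting a GraphQL query over HTTP. The Python sketch below shows the general shape of that interaction; the endpoint URL, field names and token are placeholders for illustration, not Autodesk's published AEC data model schema.

```python
import requests

GRAPHQL_URL = "https://aps.example.com/aec/graphql"   # hypothetical endpoint

QUERY = """
query WallsInProject($projectId: ID!) {
  project(id: $projectId) {
    elements(filter: { category: "Walls" }) {
      id
      properties { name value }
    }
  }
}
"""

response = requests.post(
    GRAPHQL_URL,
    json={"query": QUERY, "variables": {"projectId": "demo-project"}},
    headers={"Authorization": "Bearer <access-token>"},
    timeout=30,
)
response.raise_for_status()

# The caller neither knows nor cares which underlying database served the data.
for element in response.json()["data"]["project"]["elements"]:
    print(element["id"], element["properties"])
```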

Common and extensible schemas

One of the key benefits of moving to a unified database is that data from all sorts of applications can sit side by side, level 200 / 300 AEC BIM model data alongside Inventor fabrication-level models. I asked Mujtaba to provide some context to how the new unified database schema works.

“It’s a set of common schemas, which we call common data currencies — and that spans across all our three industries,” he explained. “So, it could be something basic, like basic units and parameters, but also definitions or basic definitions of geometry. Now, these schemas are semantically versioned. And they are extensible. So that allows other industry verticals to be able to extend those schemas and define specific things that they need in their particular context. So, whenever this data is being exchanged, or moved across different technologies under the covers, or between different silos, you know they’re using the language of the Common Data currencies.

Forma in the long-term is bold, brave and, in its current incarnation, the down payment for everything that will follow afterwards for Autodesk. However, it has been a long time coming

“On top of that data model, besides ECS, we also have a rich relationship framework, creating relationships between different data types and different levels of detail. We spend a lot of time making sure that relationship, and that way of creating a graph-like structure, is common across all industries. So, if you’re going through the API, if you’re looking at Revit data and how you get the relationship, it will be the same as how you get Fusion relationships.”

To put it in common parlance, it's a bit like PDF. There is a generic PDF definition, which has a common schema for a whole array of stuff. But then there are PDF/A, PDF/E and PDF/X, which are more specific to a type of output. Autodesk has defined the nucleus, the schema for everything that has some common shared elements. But then there are some things that you just don't need if you're in an architecture environment, so there will be flavours of the unified database that have extensions. But convergence is happening and, increasingly, AEC is meeting manufacturing in the offsite factories around the world.
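A toy sketch of that 'common core plus flavoured extensions' idea, including the semantic versioning Mujtaba mentions, might look something like this in Python. The schema names and fields are invented for illustration and are not Autodesk's actual common data currencies.

```python
# A common, semantically versioned 'data currency' shared by every industry.
common_length = {
    "schema": "common.parameter.length",
    "version": "1.2.0",
    "fields": {"value": "float", "unit": "string"},
}

# An AEC-specific schema that extends the common core with its own fields.
aec_wall = {
    "schema": "aec.element.wall",
    "version": "0.4.0",
    "extends": ["common.parameter.length"],
    "fields": {"height": "common.parameter.length", "fire_rating": "string"},
}

def satisfies(required: str, provided: str) -> bool:
    """Semantic-versioning style check: same major version, provided minor >= required."""
    r_major, r_minor, _ = (int(x) for x in required.split("."))
    p_major, p_minor, _ = (int(x) for x in provided.split("."))
    return p_major == r_major and p_minor >= r_minor

print(satisfies("1.0.0", common_length["version"]))   # True: 1.2.0 still speaks 1.x
```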

Mujtaba explained that, because of convergence, Autodesk is spending a lot of time on governance of this data model, trying to push as much as possible into the common layer.

“We’re being very selective when we need to do divergence,” he said. “Our strategy is, if there is divergence, let it be intentional not accidental. Making sure that we make a determined decision that this is the time to diverge but we may come back at some point in time. If those schemas look like they’re becoming common across industries, then we may push it down into the common layer.

“Another piece of this, which is super important, is that these schemas are not just limited to Autodesk. So obviously, we understand that we will have a data model, and we will have schemas, but we are opening these schemas to the customers so they can put custom properties into the data models. That has been a big demand from customers. ‘So, you can read the Revit data, but can I add properties? For example, can I add carbon analysis data, or cost data back into this model?’

“Because customers would just like to manage it within this data model, custom schemas are also becoming available to customers too.”

Back in the days of AutoCAD R13, Autodesk kind of broke DWG by enabling ARX to create new CAD objects that other AutoCAD users could not open or see. The fix for this was the introduction of proxy objects and ‘object enablers’, which had to be downloaded so ARX objects could be manipulated. Against this backdrop, the idea that users can augment Forma’s schema was a concern. Mujtaba explained, “You can decide the default level of visibility, so only you can see it, but you might say this other party can see it, or you can say this is universally visible, so anyone can see it.

“To be honest, we are learning. We have a private beta of data extensibility running within the Fusion space with a bunch of customers, because this is a big demand from our customers and partners. So, we are learning.

“The visibility aspect came up and, in fact, our original visibility goal was only ‘you can see this data’. And then we learned from the customers that no, they would selectively want to make this data visible to certain parties or make it universally visible.

“We are continuing to do very fast iterations. Now, instead of going dark for several years and trying to boil the ocean, we are saying, ‘here’s a workflow we’d like to solve. Let’s bring in a bunch of customers and let’s solve this problem. And then move to the next workflow’. So that’s why you see things coming out at a very rapid pace now, like the data exchange connectors and the data APIs.”
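As a rough sketch of what customer-extended data with per-party visibility might look like, the Python below captures the idea that a custom value, such as carbon or cost data, travels with the element along with a declared audience. The property names and visibility levels are invented for illustration, not the beta's actual design.

```python
from enum import Enum
from typing import Optional

class Visibility(Enum):
    PRIVATE = "owner_only"      # only the party that added the property can see it
    SHARED = "named_parties"    # visible to an explicit list of collaborators
    PUBLIC = "everyone"         # universally visible on the element

def add_custom_property(element: dict, name: str, value,
                        visibility: Visibility,
                        shared_with: Optional[list[str]] = None) -> dict:
    """Attach a customer-defined property (e.g. carbon or cost data) to an element."""
    element.setdefault("custom_properties", []).append({
        "name": name,
        "value": value,
        "visibility": visibility.value,
        "shared_with": shared_with or [],
    })
    return element

wall = {"id": "wall-042", "category": "Walls"}
add_custom_property(wall, "embodied_carbon_kgCO2e", 184.0,
                    Visibility.SHARED, shared_with=["structural-engineer"])
print(wall["custom_properties"])
```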

It’s clear that the data layer is still, to some degree, in development, and I don’t blame Autodesk for breaking it down into workflows and connecting the tools that are most commonly used. This is a vast development area and an incredibly serious undertaking on the part of Autodesk’s development team. It seems that the strategy has been clearly identified and the data tools and technologies decided upon. In the future, I am sure files will still play a major role in some interoperability workflows but, with APIs and an industry interest in open data access, files may eventually go the way of the dodo.

Conclusions

Forma, in the long term, is bold, brave and, in its current incarnation, the down payment for everything that will follow afterwards for Autodesk. However, it has been a long time coming. In 2016, the same year that Quantum was announced at Autodesk University, Autodesk also unveiled Forge, a cloud-based API which would be used to break down existing products into discrete services which could be run on a cloud-based infrastructure.

Forge was designed to help Autodesk put its own applications in the cloud, as well as to help its army of third-party developers build an ecosystem. Forge was recently rebranded as Autodesk Platform Services (APS). I actually don’t know of any other software generation change where so much time and effort was put into ensuring the development tools and APIs were given priority. The reason for this is possibly because, in the past, these tools were provided by Microsoft Foundation Classes, while for the cloud Autodesk is having to engineer everything from the ground up.

While this article may appear a bit odd, as one might expect a review of the new conceptual design tools, for me the most important thing about Forma is the way it is going to change how we work, the products we use, where and how data is stored, how data is shared and accessed, as well as the new capabilities it enables, such as AI and machine learning. Looking at Forma only through the user-facing functionality that has been exposed so far does not do it justice. This is the start of a long journey.

Autodesk Forma is available through the Architecture, Engineering & Construction (AEC) Collection. A standalone subscription is $180 monthly, $1,445 annually, or $4,335 for three years. Autodesk also offers free 30-day trials and free education licences. While Forma is available globally, Autodesk will initially focus sales, marketing and product efforts in the US and European markets where Autodesk sees Forma’s automation and analytical capabilities most aligned with local customer needs.


From Autodesk SpaceMaker to Autodesk Forma

[Image: Revit and Forma integration]

Over the last four years there have been many software developers aiming to better meet the needs of conceptual design, site planning and early feasibility studies in the AEC process. These include TestFit, Digital Blue Foam, Modelur, Giraffe and Spacemaker to name but a few. The main take up of these tools was in the property developer community, for examining returns on prospective designs.

[Image: Rapid operational energy analysis]

In November 2020, just before Autodesk University, Autodesk paid $240 million for Norwegian firm Spacemaker and announced it as Autodesk’s new cloud-based tool for architectural designers. Autodesk had been impressed with both the product and the development team, and rumours soon emerged that the team would be the one to develop the cloud-based destination for the company’s future AEC direction.

This May, Autodesk launched Forma, which is essentially a more architecturally biased variant of the original application and, at the same time, announced it was retiring the Spacemaker brand name. This created some confusion that Forma was just the new name for Spacemaker. However, Spacemaker’s capabilities are just the first instalment of Autodesk’s cloud-based Forma platform, the conceptual capabilities, which will be followed by Revit.

[Image: Microclimate analysis]

Forma is a concept modeller which resides in the cloud but has tight desktop Revit integration. Designs can be modelled and analysed in Forma and sent to Revit for detailing. At any stage, models can be sent back to Forma for further analysis. This two-way connection lends itself to proper iterative design.

At the time of acquisition, Spacemaker’s geometry capabilities were rather limited. The Forma 3D conceptual design tool has been enhanced with the merging of FormIt’s geometry engine (so it now supports arcs, circles and splines), together with modelling based on more specific inputs, like number of storeys etc. It’s also possible to directly model and sculpt shapes. For those concerned about FormIt, the product will remain in development and available in the ‘Collection’.

[Image: Daylight potential analysis]

One of the big advantages hyped by Autodesk is Forma’s capability to enable modelling in the context of a city, bringing in open city data to place the design in the current skyline. Some countries are served better than others with this data. Friends in Australia and Germany have not been impressed with the current amount of data available, but I am sure this is being worked on.

Looking beyond the modelling capabilities, Forma supports real-time environmental analyses across key density and environmental qualities, such as wind, daylight, sunlight and microclimate, giving results which don’t require deep technical expertise to interpret. It can also be used for site analysis and zoning compliance checking.

At time of launch, Autodesk demonstrated plug-ins from its third-party development community, with one from TestFit demonstrating a subset of its car parking design capability, and the other from Shapedriver for Rhino.

Talking to the developers of these products, they were actively invited to port some capabilities to Forma. Because of the competitive nature of these products, this is a significant move. Rhino is the most common tool for AEC conceptual design, and TestFit was actually banned from taking a booth at Autodesk University 2022 (we suspected because TestFit was deemed to be a threat to Spacemaker). In the spirit of openness, which Autodesk is choosing to promote, it’s good to see that wrong righted. By targeting Rhino support, Autodesk is clearly aware of how important a role pure geometry plays for a significant band of mature customers.

The post Autodesk Forma: a deep dive into the data lake appeared first on AEC Magazine.

]]>
https://aecmag.com/collaboration/autodesk-forma-a-deep-dive-into-the-data-lake/feed/ 0
Hypar: text-to-BIM and beyond https://aecmag.com/ai/hypar-text-to-bim-and-beyond/ https://aecmag.com/ai/hypar-text-to-bim-and-beyond/#disqus_thread Tue, 21 Mar 2023 17:00:00 +0000 https://aecmag.com/?p=17192 Could Hypar help users sidestep much of the detail modelling / drawing production associated with BIM 1.0?

The post Hypar: text-to-BIM and beyond appeared first on AEC Magazine.

]]>
Could a small software firm help users to sidestep much of the detail modelling and drawing production associated with BIM 1.0? Martyn Day reports on Hypar, a company looking to do exactly that

Imagine a computer aided design system that can model a building from a written definition. The user simply types their request into a chat box: ‘A two-storey retail building, and 14 storeys of residential, arranged in an L shape.’ In seconds, a model appears, conforming to the user’s exact requirements, complete with surrounding site conditions and including structural elements, facades and condominium layouts.

To refine the results, the user just adds to the descriptive text. Different floors, for example, might be defined by different text commands.

This is where BIM 2.0 is headed. In many ways, it’s already there.



Founded in 2018, Hypar.io is the brainchild of two seasoned industry veterans: Anthony Hauck and Ian Keough. Hauck was the head of Revit product development and spearheaded Autodesk’s generative design development. Keough is fondly known as the ‘father of Dynamo’, a visual programming interface for Revit.

The pair are widely respected in the industry for their work on developing some of the most popular programmatic design tools used by architects today. When it comes to design tool development, they have been there, done that and bought the T-shirt.

At Autodesk, Hauck and Keough took part in the development of Project Fractal, a tool compatible with Dynamo for generating multiple design solutions according to a customer’s own design logic. It was used in space and building layout design, optimising room placement and structure and generating options that meet core specifications.

This led to the eventual emergence of Hypar — a self-contained, web-based cloud platform and API, which executes code (in Python and C#) to swiftly generate hundreds or even thousands of designs based on design logic. Users can preview resulting 3D models on both desktop and mobile, along with analytics data.


[Image: Hypar is designed to swiftly generate hundreds or even thousands of designs based on design logic]

Hypar’s special feature is that it provides a common user interface that isn’t reliant on heavy data sets, in the way that many of today’s BIM tools are, while also supporting open IFC (Industry Foundation Classes) files.

Additionally, it is open to communities of tool developers who may wish to share or sell algorithms, or benefit from Hypar’s suite of ever-evolving tools and growing number of open-source resources. This makes it a popular choice for generative tools development. These generative tools can then be used by non-programming designers to experiment with different design variations with a single click. Output from Hypar can also be connected to other processes through commonly used formats and scripts.
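The general shape of such a generative function (design logic in, many candidate options out) is easy to sketch. The Python below illustrates the pattern only; it is not Hypar's actual SDK, and the class and function names are invented.

```python
from dataclasses import dataclass
from itertools import product

@dataclass
class MassingOption:
    storeys: int
    floor_to_floor_m: float
    footprint_m2: float

    @property
    def gross_area_m2(self) -> float:
        return self.storeys * self.footprint_m2

def generate_options(storeys_range, footprints_m2,
                     floor_to_floor_m=3.5, max_gross_area_m2=60_000):
    """Encode the design logic once, then enumerate every combination that satisfies it."""
    candidates = (MassingOption(s, floor_to_floor_m, f)
                  for s, f in product(storeys_range, footprints_m2))
    return [c for c in candidates if c.gross_area_m2 <= max_gross_area_m2]

# Hundreds or thousands of variants come simply from widening the input ranges.
for option in generate_options(range(8, 17), [900, 1200, 1500]):
    print(f"{option.storeys} storeys, {option.gross_area_m2:,.0f} m² gross")
```

On a cloud platform, each surviving option can then be handed to downstream functions, such as structure or façade generators, and previewed as a 3D model with its analytics.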

The core Hypar team now numbers twelve people and the company has raised a couple of rounds of seed funding, totalling over $2.5 million, from AEC specialist VC firm Building Ventures and Obayashi Construction of Japan.

On a mission

“Our mission from the beginning, and still to this day, is to deliver the world’s building expertise to realise better buildings,” explains Keough. However, he adds, there have been twists and turns along the way.

“When we started Hypar, I thought that, because I came out of the computational design community with Grasshopper and Dynamo, we would sell to computational designers. It turned out that, for a lot of different reasons that I’ve been bloviating about on Twitter and elsewhere, we had built ourselves into a cul-de-sac,” he says.

“We had our own visual programming languages; we were kind of off on our own spur of computational history, which the AEC world had created for itself. Anthony and I identified that as a dead end.”


[Image: A starting point for Text-to-BIM]

The challenge with selling into architecture firms, he continues, is that they are already highly saturated with Grasshopper and Revit. Any newcomer has to fight against this deeply entrenched competition.

“What we recognised is that from one project to the next, programmers couldn’t take the logic that they had created, representing some sort of automation for the purpose of the project, and turn that into a repeatable system that could be used from one project to the next, to the next,” Keough continues. The dream with Dynamo Package Manager, he says, was to make a routine and share this back.

“The problem was, all of those systems — Rhino, Grasshopper, Dynamo — were built on top of hero software, but the underlying premise of BIM was flawed.”

Hauck and Keough’s thinking is that while you can automate a certain amount on top of a BIM model, you’re essentially still automating on top of a flawed premise. “What ended up happening was Dynamo was used very effectively for automating the renaming of all the sheets in drawing sets, instead of automating the creation of a specific, novel type of structural system,” he says.

“Even in the cases where people did do that, you couldn’t then take that structural system generator that you had created, and easily couple it with somebody else’s mechanical system generator, or somebody else’s facade system generator. These ‘janky’ workflows had everybody writing everything out to Excel, then reading it from another script.”

Systemic thinking

Analysing the problem alongside customers from the building product manufacturing world, Hauck and Keough decided to “climb the ladder of abstraction”, he says, going beyond the BIM workflow where customers build things with beams and columns, manually, and instead thinking systemically.

“The product manufacturing partners made this very clear — the value of systemic thinking — because they have highly configurable systems, for which they have experts sitting, poring over PDFs, measuring things, looking up specifications in their product books, to achieve fully designed systems, to make a sale,” he says.

Every building product manufacturer who came to Hauck and Keough had this same problem. From 2019 to 2022, the pair underwent a shift in thinking, from seeing buildings as projects to instead seeing buildings as products and as assemblies of systems that work together.

Says Keough: “Why do clashes exist? Clashes exist because everyone sits in silos and they operate without interacting with each other. They don’t see what each other has been doing for a couple of weeks, until they merge the Revit models together. Then they find that these components run into other geometry. We need to make systems that can interoperate with each other and understand each other.”

Text-to-BIM

So what’s happening when a request is entered in the chat box? Behind the scenes, Hypar is magically mapping the natural language input (for example, ‘2-storey parking garage’) onto Hypar parametric functions. This isn’t AI generating a model from scratch; it’s parametric automation, informed through natural language input.
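Mechanically, that mapping can be as simple as extracting structured parameters from the sentence and dispatching them to the right generator. The toy Python sketch below illustrates the idea; it is not Hypar's implementation, and a production system would be far more robust.

```python
import re

def parse_request(text: str) -> dict:
    """Pull structured parameters out of a natural-language request (toy example)."""
    lowered = text.lower()
    storeys = [int(n) for n in re.findall(r"(\d+)[- ]storey", lowered)]
    shape = re.search(r"\b([lut])[- ]shape", lowered)
    return {
        "uses": [u for u in ("retail", "residential", "parking") if u in lowered],
        "storeys": storeys,
        "shape": shape.group(1).upper() if shape else "rectangular",
    }

request = ("A two-storey retail building, and 14 storeys of residential, "
           "arranged in an L shape.")
print(parse_request(request))
# {'uses': ['retail', 'residential'], 'storeys': [14], 'shape': 'L'}
```

Even this toy version shows its limits (the spelled-out 'two-storey' slips through), and handling that kind of variation is where natural language processing earns its keep.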

Not only does Hypar create the architectural model; flipping to a structure view reveals the steelwork skeleton that has also been modelled. Carrying on modelling, we might add a glass façade or a 10-storey residential L-shaped tower. It’s possible to section the model and view the interior, complete with residence layouts. All this takes seconds.

To change a fully detailed building option in Revit from an L-shaped building to a U-shaped building might typically take three weeks of work. In Hypar, it takes a couple of seconds — not bad! On top of this, all sessions can be shared as collaborative workflows.

Contrary to popular belief, Hypar is not based on Grasshopper or Dynamo. Instead, it is built from the ground up, from the geometry kernel to its generative logic, to do everything itself. The reason for this is that, as a cloud application, it runs on microservices and could not be created using historical desktop code. The results are fairly rectilinear, as the software does not support NURBS yet, although there is a tight integration with Grasshopper, where Grasshopper scripts can be turned into Hypar functions.

All of Hypar’s BIM elements are defined using JSON schema. This is very similar to how Speckle (https://speckle.systems) defines entities that stream across the Speckle Server. Ultimately, it has the ability to export to IFC, but IFC is not native to the platform itself.
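To give a flavour of what schema-defined elements look like in practice, here is a hedged Python sketch using the jsonschema package. The schema and instance below are invented for illustration and are not Hypar's actual element definitions.

```python
from jsonschema import validate   # pip install jsonschema

# Illustrative schema for a simple element; not Hypar's actual definitions.
beam_schema = {
    "type": "object",
    "required": ["Id", "Profile", "Length"],
    "properties": {
        "Id": {"type": "string"},
        "Profile": {"type": "string"},
        "Length": {"type": "number", "minimum": 0},
        "Material": {"type": "string"},
    },
}

beam = {"Id": "beam-001", "Profile": "W310x39", "Length": 6.0, "Material": "Steel"}

validate(instance=beam, schema=beam_schema)   # raises ValidationError if malformed
print("beam instance conforms to the schema")
```

Because the definition is plain JSON, any client, whether a web viewer, a Grasshopper bridge or an IFC exporter, can read and validate the same elements without a heavyweight authoring tool in the loop.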

Regarding Revit support, Keough explains, “We have really rich support for Revit, in both import and export. We knew we had to do that from Day One, because that’s where people are and that’s where people are going to be for a while, at least in the production of documents.”

That said, Hypar is working with a lot of customers who use Revit for now — not because it’s the perfect system, but because it’s what they have at their disposal, he says. “Oftentimes, they’re building functions on Hypar to export a particular type of panel drawing, things that are really hard, if not impossible, to do in Revit without some sort of accessory automation, like Dynamo.”


[Image: Design in the context of surrounding site conditions]

Over time, as buildings increasingly move to prefabrication and modular construction, and the primacy of drawings is reduced, he says, there will be a corresponding increase in the need for custom types of drawings and outputs to robotic fabrication. “And we’re vectoring to have Hypar on a trajectory to meet those needs,” he says.

One of the biggest issues with BIM 1.0 was that it failed to cross the chasm from scaled drawings to 1:1 fabrication models. Hypar is designed to represent data at multiple scales. As Keough puts it: “We have customers working at every level of detail, from design feasibility all the way through to construction. We have customers who are actually making building construction drawings and specific types of fabrication drawings out of Hypar because, just like the systems that generate the actual building stuff, structural systems, the systems for generating those visualisations are just another sort of extension on top of Hypar.”

Recently at the Advancing Prefabrication conference, Hypar announced the integration work it has carried out with DPR Construction in the US, in order to automate the layout of drywall. “Imagine that, as an architect draws, a wall generates all the layout drawings, and all of the pre-cut fabrication drawings for off-site fabrication of the individual drywall pieces, as well as the stockpiling information about exactly how those pieces of pre-cut drywall are going to be stockpiled on site,” says Keough.

Imagine a CAD system that can model a building from a written definition. The user simply types their request into a chat box and in seconds, a model appears

But with increased detail comes performance issues — so how does Hypar cope with large amounts of data? He responds that the size and complexity of workflows that customers are trusting to Hypar are beginning to test the system. “This is a natural progression, when you build any kind of software,” he says. “When Anthony was building Revit, back in the day, and when I was building Dynamo, there’s always a point at which your customers start using this thing you created for larger and larger and larger models.”

This means that one of the things he and Hauck need to consider very carefully is performance. “If we are inviting all these people to play in this environment, and we can represent all these things with this level of detail, then the compute and your interaction as a user within that compute must operate at the level of performance that historically you’re used to. That’s going to be a challenge, because we’re going to be representing a lot more on this platform.”

Future of building

Our conversation then moves on to the future of building — and how buildings are becoming products. As Keough sees it, a building is composed of materials and components from a bunch of different manufacturers, just like a laptop. All of these systems must fit tightly and snugly into a clearly defined chassis. From fans to motherboards, the interfaces between them are agreed upon and codified, in order to allow that to happen.

“That’s how buildings are going to start being delivered — and there is not a software out there on the market right now that thinks of buildings in that way,” says Keough. “We all still think of buildings as a big muddy hole in the ground that we fill with sticks and bricks. It’s going to be software like Hypar and others out there which are starting to evolve to think of buildings systemically, getting the system logic and interfaces with other systems in the code. There is a future in which clash coordination is not the way that we coordinate any longer, because it’s not required, because all the systems in a building understand where each other are and really understand how to adapt around each other.”

The demise of drawings

I tell Keough that I am hearing from more AEC firms that they want to move away from producing drawings and get better automated output. It’s something he hears a lot, too: “Increasingly our customers are asking for drawingless workflows.” His colleague Anthony has a “wonderfully concise and pithy” way of describing this problem, he adds. “He says that, for many of the people, Revit is a tax. They do the Revit thing, because they’re in some way required to do it. They have to deliver it.”

Take, for example, virtual design and construction, or VDC, the act of turning a design model into a constructible object. It typically encompasses millions of hours of labour, producing increasingly detailed versions of models, so that firms can coordinate tasks such as putting studs in a wall. But as DPR Construction has demonstrated, a very large contractor can use Hypar to automate the layout of all of its drywall.


[Image: Hypar can generate structural models as well as architectural models]

“Millions of square feet of drywall, the most mundane kind of thing in the world — but DPR is turning this into a highly optimised, efficient process, which starts with CNC cutting every single piece of drywall, then robotic layout on a slab,” he says. “If you are CNC cutting all the drywall and you have an automation engine that will do all of that for you based on the architect’s model, it will automate everything when the architect model changes as well. Why do you need drawings?”

Drawing-less building delivery is already possible today, says Keough. Some firms have already embraced it. “And I think if you swim even further back upstream, you actually start asking the questions that we’re asking. If you go to model-based delivery, but you don’t take a huge amount of work out of the process, if all you do is go to model-based delivery, then why are people still manually building those models?”

A faster horse?

Every new company emerges because someone is banging their head against a wall in frustration with an existing process, Keough believes. Hypar is no different, he says. “It came out of Anthony and I banging our collective heads against the wall and deciding that the industry needed something better. ‘There will be drawings’ is the first assumption, not ‘There will be a building’.”

As Hypar stretches from concept to fabrication, it has the potential to replace BIM 1.0 workflows with highly tailored solutions, but more importantly, reduce workloads from weeks to seconds

But with start-ups such as Snaptrude emerging, with the goal of taking on the BIM market leaders, what does Keough think about software that sticks to BIM 1.0 workflows, but aims to speed things up? For him, it’s just a “faster horse”, rather than anything truly revolutionary, he says.

“I think there should be some faster horses. There’s a natural sort of regenerative cycle that needs to happen in software where people build the next version of, say, SketchUp. Even though SketchUp is amazing, maybe somebody can innovate on that just a little bit. But I think the next generation of tools in this industry is about the automation of the generation of building systems, and the encoding of expertise that’s carried around in the heads of architects and engineers right now as software.”

In a world where software tools fully detail most building models and automatically produce drawings, one has to wonder what happens to all the BIM seats out there. Will firms only need a few seats to do the work provided by thousands of seats in the past? There is already a slow shift to different charging models, like per project, or based on the value of projects. If less software is needed, surely the price will go up?

“At the point at which that happens, the cost of software will, I think, necessarily need to increase, yes, because there will be fewer seats sold,” agrees Keough. “But it will also often open all kinds of opportunities for software to be sold by value. And by that, I mean right now, for instance, we’re working with a lot of building product manufacturers. We automate the generative logic around which a building product manufacturer’s system is designed.”

For these customers, the value proposition is that they sell more of their product. That means that the value proposition for Hypar must also be more of their product sold, too. “We make more money, because they use the software as part of that sale. I try and figure out a number that works that you’ll pay us, and it’s the highest number that I can get you to pay us every year. Yes, we’ll see all kinds of interesting effects in the marketplace of software becoming generative and some of those will be the actual business models of the software.”

Conclusion

Over the last year, AEC Magazine has focused on highlighting small new innovators working to disrupt established BIM workflows and drive big jumps in productivity. The common thread in nearly all next-generation tools is that developers are looking to automate 3D geometry creation and provide some level of detailing and drawing production. A number are even looking to a time when the industry dispenses with drawings altogether. These efforts typically have a sweet spot — the ‘bread and butter’, rectilinear, everyday buildings such as offices, residential, retail, hospitals, schools and so on.

From what I have seen so far, Hypar is aiming to potentially save weeks, if not months, of time, with far-reaching consequences for the industry, in terms of software usage (how many seats of BIM tools are required), how the industry bills its clients, and how much software companies bill us.

Hypar is the culmination of all the years of industry experience that Keough and Hauck have collectively amassed, with its blend of expressive programming and ease of use. As it stretches from concept to fabrication, it has the potential to replace BIM 1.0 workflows with highly tailored solutions, but more importantly, reduce workloads from weeks to seconds.

Individual licences of Hypar are just $79 per month, per user, with discounts available for customers buying groups of five licences or entering into Enterprise deals.


Keough on IFC

Hypar’s Ian Keough (@ikeough) is always good for an interesting tweet on software design and BIM. He also has occasional comments to make about IFC and the inherent problems it brings to the table. We asked Keough about his views on IFC.

[Image: Ian Keough]

“Going way back to the beginning of Hypar, we always thought that the sort of open standards and protocols that we have in our industry are severely lacking,” he tells us.

“The standards that we develop should be accessible to anybody with an even basic knowledge of computation. It shouldn’t require Autodesk’s resources or Trimble’s resources or anyone else’s resources to be able to use the open data standards that we have in our industry.”

He’s a software programmer with 10 to 15 years of experience now, he continues, “and I still find IFC almost impossible to use for any day-to-day sort of thing. A lot of what we’ve done, like the open-source library that sits at the heart of Hypar, just makes that stuff easier and more accessible.”


Main image: Text-to-BIM: Hypar can model buildings from a written definition


Find this article plus many more in the March / April 2023 Edition of AEC Magazine
👉 Subscribe FREE here 👈

The post Hypar: text-to-BIM and beyond appeared first on AEC Magazine.

]]>
https://aecmag.com/ai/hypar-text-to-bim-and-beyond/feed/ 0
AEC May / June 2023 Edition out now https://aecmag.com/technology/aec-may-june-2023-edition-out-now/ https://aecmag.com/technology/aec-may-june-2023-edition-out-now/#disqus_thread Tue, 06 Jun 2023 19:29:05 +0000 https://aecmag.com/?p=17943 Defining BIM 2.0, A deep dive into Autodesk Forma, Workstation Special Report, plus lots more

The post AEC May / June 2023 Edition out now appeared first on AEC Magazine.

]]>

We kick off summer with an incredible bumper edition of AEC Magazine, available to view now, free, along with all our back issues.

Subscribe to the digital edition free + all the latest AEC technology news in your inbox, or take out a print subscription for $49 per year (free to UK AEC professionals).

What’s inside the May / June edition?

Defining BIM 2.0
With proprietary file formats, subscription, third-party ecosystems and industry standards, what happens when disruptive technology arrives?

Autodesk’s AEC cloud platform has arrived with an initial focus on conceptual design. But Forma will become much more than this. We explore how it handles data.
 
NXT BLD and NXT DEV conference preview
On 20-21 June in London, AEC Magazine’s thought provoking conferences will give a glimpse into the future of AEC technology and software development.
 
It’s time for a unified platform that supports the interests and activities of all stakeholders in a construction project, and where design and fabrication are linked
 
While construction struggles to achieve the same advancements in productivity from technology as other sectors, there remains tremendous potential for change
 
The scale and complexity of Minnucci Associates’ Central Station project gives a compelling glimpse into the future of facilities management
 
Over 40 pages of content dedicated to the very latest workstation technology for AEC workflows, from desktop to cloud

The post AEC May / June 2023 Edition out now appeared first on AEC Magazine.

]]>
https://aecmag.com/technology/aec-may-june-2023-edition-out-now/feed/ 0
OpenSpace BIM+ brings BIM workflows to site https://aecmag.com/construction/openspace-bim-brings-bim-workflows-to-site/ https://aecmag.com/construction/openspace-bim-brings-bim-workflows-to-site/#disqus_thread Fri, 21 Jul 2023 10:45:17 +0000 https://aecmag.com/?p=18135 Software can be used to compare as-built conditions to design intent

The post OpenSpace BIM+ brings BIM workflows to site appeared first on AEC Magazine.

]]>
Software can be used to compare as-built conditions to design intent

OpenSpace, a specialist in 360° reality capture and AI-powered analytics, has introduced OpenSpace BIM+, a suite of 3D tools designed to accelerate BIM and field coordination workflows.

OpenSpace BIM+ is an add-on product to OpenSpace Capture, for ‘fully documenting’ construction sites. It includes BIM analysis tools and BIM coordination tools, and offers ‘seamless’ model management.

For BIM analysis, the software allows site teams to work in the BIM model and compare that to what they are building. Any project team member can click and navigate to saved views in the BIM model, review model elements overlaid within the 360° photos, and access the model from any iPhone, iPad, or Android device when offline.

Point cloud data can also now be stored, viewed, analysed, and shared in OpenSpace from mobile and terrestrial laser scanners and OpenSpace’s 3D Scan feature, bringing reality capture data to one central location.

For BIM coordination, the software includes features designed to help teams quickly solve problems by aligning design intent with on-site conditions to speed up responses, approvals, and project schedules. OpenSpace Capture’s field note feature is now part of BIM workflows with OpenSpace BIM+, allowing field teams to add detailed images and notes during site walks or back at their desks. BCF file export is also included, enabling streamlined workflows with leading BIM coordination tools like Revizto, Navisworks, and BIM Track.

“We recognised that many of our customers were seeking more powerful, yet easy-to-use BIM tools from OpenSpace,” shared Neel Sheth, vice president of product at OpenSpace. “Our customers’ BIM usage in OpenSpace was outpacing the industry’s overall rate of BIM adoption. That told us that OpenSpace might be able to help drive BIM utilisation for those teams that are adopting and investing in BIM.

“With OpenSpace BIM+, we have partnered with VDC leaders and their project teams to help them work with BIM as easily as navigating a Google Street View.”

The post OpenSpace BIM+ brings BIM workflows to site appeared first on AEC Magazine.

]]>
https://aecmag.com/construction/openspace-bim-brings-bim-workflows-to-site/feed/ 0
Grand Central – a glimpse into the future of FM https://aecmag.com/cafm/grand-central-a-glimpse-into-the-future-of-fm/ https://aecmag.com/cafm/grand-central-a-glimpse-into-the-future-of-fm/#disqus_thread Fri, 02 Jun 2023 06:21:55 +0000 https://aecmag.com/?p=17808 A closer look at the scale and complexity of Minnucci Associates’ Central Station project

The post Grand Central – a glimpse into the future of FM appeared first on AEC Magazine.

]]>
The scale and complexity of Minnucci Associates’ BIM pilot project at Naples Central Station gives a compelling glimpse of what better facilities management might look like in the future for other organisations

Opening out onto the city’s imposing and historic Piazza Garibaldi, Naples Central Station (Napoli Centrale) occupies a site that has been home to a railway station since the mid-nineteenth century. The current station was designed and built in the 1950s and today handles around 400 trains per day and 150,000 passengers, making it the sixth largest train station in Italy for passenger flow.

It is also the site of a pioneering BIM pilot project, exploring the use of BIM for facility management on a massive scale. The BIM manager for this project was Minnucci Associates, an engineering company based in Rome, Anguillara Sabazia and Milan, working on behalf of the station owner, Rete Ferroviaria Italiana (RFI), which is responsible for the management and safety of railway traffic on the entire national network, including tracks, stations and installations.

Faced with the need to develop new buildings while simultaneously keeping existing buildings and equipment in good condition, RFI initiated the project with a view to capturing the station’s assets. Through surveying and modelling, a digital twin would be created and then converted to an open and collaborative format, supporting integration with RFI’s facilities management system in order to improve the whole-life value of assets.

[Image: The site survey resulted in a massive 380 GB point cloud]

A heavyweight pilot

For the team at Minnucci Associates, this challenge represented one of the most data-heavy projects it had ever worked on. To give an idea of the complexity involved, Naples Central Station is spread across five buildings, covering around 400,000 square feet and containing some 12,500 components subject to maintenance. These components include electrical, HVAC, hydraulics and vital safety equipment.

“The main challenge was the vast size of the model. Graphisoft BIMcloud came to our aid, as we were able to divide the survey data and the resulting model into federated files, then remerge later either within Archicad, or using the exported IFC models”
Daniele Piccirillo, BIM Manager, Minnucci Associates

The work began with a survey of the site, using laser scanners and orbital pictures, resulting in a massive point cloud of 380 GB in size. From there, the team at Minnucci Associates developed a digital twin of the station, including equipment needing maintenance.

The station was modelled by comparing and combining survey data, existing drawings and census outcome, with all the input data and output models organised in a common data environment (CDE).

Graphisoft’s Archicad was the chosen BIM authoring tool. In total, 44 models were federated and then imported into Solibri in IFC format. By mapping the BIM authoring tool’s data to IFC open standards, Minnucci Associates was able to create automated workflows and instant asset recognition.
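The article doesn't detail the team's scripts, but the kind of automated asset recognition described here can be sketched with the open-source IfcOpenShell library in Python. The file name and the list of maintainable IFC classes below are illustrative, not the project's actual asset register.

```python
import ifcopenshell   # pip install ifcopenshell

model = ifcopenshell.open("station_mep_federated.ifc")   # illustrative file name

# IFC classes that typically correspond to maintainable MEP components.
maintainable_classes = ("IfcFlowTerminal", "IfcPump", "IfcValve", "IfcUnitaryEquipment")

assets = []
for ifc_class in maintainable_classes:
    for element in model.by_type(ifc_class):
        assets.append({
            "global_id": element.GlobalId,   # stable key for the FM system
            "class": element.is_a(),
            "name": element.Name,
        })

print(f"{len(assets)} maintainable components found for the asset register")
```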

Using Graphisoft’s BIMcloud, the team organised survey data and models into a single, catalogued database. Formalised in a BIM execution plan, this allowed the RFI team to approve changes and request information remotely throughout the entire process. This system also allowed them to reincorporate individual files, both in Archicad’s authoring model and later in IFC models. Today, on-site tasks are supported through the use of a mobile app directly connected to the CDE.

Scale and complexity

“The main challenge was the vast size of the model,” says Daniele Piccirillo, BIM Manager at Minnucci Associates. “Graphisoft BIMcloud came to our aid, as we were able to divide the survey data and the resulting model into federated files, then remerge later either within Archicad, or using the exported IFC models.”

Given the scale of the site, the complexity of the models and the numerous processes supported, this is a pioneering project for the use of BIM for asset and maintenance management. It also offers a compelling glimpse of how improved facilities management might look for organisations across a wide range of industries. A digital twin approach could work for many.


Find this article plus many more in the May / June 2023 Edition of AEC Magazine
👉 Subscribe FREE here 👈

The post Grand Central – a glimpse into the future of FM appeared first on AEC Magazine.

]]>
https://aecmag.com/cafm/grand-central-a-glimpse-into-the-future-of-fm/feed/ 0