AR Archives - Rhino 3D
https://rhino3d.co.uk/tag/ar/

Introduction to Twinmotion Materials, Characters and Vegetation
https://rhino3d.co.uk/news/introduction-to-twinmotion-materials-characters-and-vegetation/ | Tue, 23 Nov 2021

Rhino3D video tutorial in which the Simply Rhino team offer an Introduction to Materials, Characters and Vegetation in Twinmotion.

Rhino 3D Tutorial – Rhino to Twinmotion Series

In this Simply Rhino video tutorial we continue our Rhino to Twinmotion series with an Introduction to Materials, Characters and Vegetation in Twinmotion.

Image shows a screenshot of a scene with a central pavilion designed in Rhino and Grasshopper which has been rendered in Twinmotion.

Our senior Rhino3D Trainer, Phil Cook, starts by using the model brought into Twinmotion in our previous video, Exporting Data from Rhino3D using the new Datasmith Rhino Exporter.

Applying Twinmotion Materials and Characters

Phil first looks at geo-locating the pavilion model, before applying Twinmotion materials and characters. Phil demonstrates two ways of adding Twinmotion trees, plants and grasses before moving on to create vehicle and character paths for truly dynamic content.

Creating and Exporting a Video from Twinmotion

The video concludes with an explanation of the process of creating and exporting a video.

Rhino to Twinmotion Video Tutorial

If you would like to learn more then watch our Introduction to Materials, Characters and Vegetation in Twinmotion video tutorial below.

Information and How to Buy Twinmotion

What is Twinmotion? Twinmotion gives you the power to create high-quality images, panoramas, and standard or 360˚ VR videos from design data. It is an easy-to-use architectural rendering solution that is suitable for everyone.

Want to find out more? Visit the Twinmotion Product page on the Simply Rhino website.

If you’re looking to buy Twinmotion software visit the Rhino Webstore and buy Twinmotion online from the Authorised UK Twinmotion Reseller, Simply Rhino.


Helpful Links, Video Software Information and Credits:

Download Twinmotion from here: https://www.twinmotion.com/en-US/plugins

Model used in the video courtesy of Othmane Kandri

Video uses Rhino3D version 7 and Twinmotion 2021.1.4 on the Windows platform; all the processes described here are transferable to Twinmotion 2022.


Previously – Watch the recording of our AR/VR for Rhino and Grasshopper User Group Meeting (October 2020) to meet Epic Games (developers of Unreal Engine and Twinmotion) along with a presentation from Heatherwick Studio.

Exporting Data from Rhino to Twinmotion
https://rhino3d.co.uk/news/exporting-data-from-rhino-to-twinmotion/ | Wed, 18 Aug 2021

Rhino3D video tutorial in which we look at exporting data from Rhino to Twinmotion using the new Datasmith Rhino Exporter.

Sunset scene of a Rhino3D modelled Pavilion rendered in Twinmotion software

In this Simply Rhino video tutorial, our senior Rhino3D Trainer, Phil Cook, takes a look at exporting data from Rhino3D to Twinmotion using the new Datasmith Rhino Exporter.

Image shows a screenshot of a scene with a central pavilion by a waterfront with a skyscraper background, designed in Rhino and Grasshopper which has been rendered in Twinmotion.

This new technology, currently in ‘Preview’ mode, has many advantages over the ‘old’ way of moving data to Twinmotion. Layer structures are preserved, and Rhino objects can be moved independently of each other once inside Twinmotion.

Improved Rhino to Twinmotion Workflow

The new Direct Link, which also uses the Datasmith file format, makes it very easy to have Rhino and Twinmotion open together and push design changes in Rhino to Twinmotion for evaluation.
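As a rough illustration of how this export step can be scripted from inside Rhino, here is a minimal RhinoPython sketch. It assumes the Datasmith Rhino Exporter plugin is installed (which registers .udatasmith as an export file type); the output path and function name are hypothetical examples, not part of the plugin itself.

```python
# Minimal sketch: script a Datasmith export of the current Rhino model.
# Assumes the Datasmith Rhino Exporter plugin is installed, so .udatasmith
# appears as a file type in Rhino's Export command.
import rhinoscriptsyntax as rs

def export_datasmith(path):
    # Export acts on the current selection, so select everything first.
    rs.Command("_SelAll", echo=False)
    # -_Export runs Rhino's Export non-interactively; the exporter is
    # chosen from the file extension. The path is a hypothetical example.
    rs.Command('-_Export "{}" _Enter'.format(path), echo=False)

export_datasmith(r"C:\Temp\pavilion.udatasmith")
```

The Direct Link itself handles this continuously for you; a scripted export like this is just one way to produce repeatable file snapshots.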

Rhino to Twinmotion Video Tutorial

If you want to know how to export Rhino data then watch the Exporting Data from Rhino3D to Twinmotion video tutorial below.

We have made a transcript of the video for anyone who would like to follow along – you’ll find the Rhino and Twinmotion (Datasmith Rhino Exporter) video transcript PDF here

Information and How to Buy Twinmotion

What is Twinmotion? Twinmotion gives you the power to create high-quality images, panoramas, and standard or 360˚ VR videos from design data. It is an easy-to-use architectural rendering solution that is suitable for everyone.

Want to find out more? Visit the Twinmotion Product page on the Simply Rhino website.

If you’re looking to buy Twinmotion software visit the Rhino Webstore and buy Twinmotion online from the Authorised UK Twinmotion Reseller, Simply Rhino.


Helpful Links, Video Software Information and Credits:

Download the Datasmith Rhino Exporter here: https://www.unrealengine.com/en-US/twinmotion/plugins

Model used in the video courtesy of Othmane Kandri

Video uses Rhino3D 7 and Twinmotion 2021.1.3 on Windows.


Previously – Watch the recording of our AR/VR for Rhino and Grasshopper User Group Meeting (October 2020) to meet Epic Games (developers of Unreal Engine and Twinmotion) along with a presentation from Heatherwick Studio.

New and available now – the next video in our Rhino to Twinmotion tutorial series is an Introduction to Twinmotion Materials, Characters and Vegetation.

AR/VR for Rhino and Grasshopper User Group Meeting | 25th February 2021
https://rhino3d.co.uk/events/ar-vr-for-rhino-and-grasshopper-user-group-meeting-25th-february-2021/ | Mon, 01 Feb 2021

Join Simply Rhino, AKT II and Mindesk for our AR/VR focused Rhino and Grasshopper UGM on February 25th. Sign-Up Now!

Join Simply Rhino, AKT II and Mindesk for our live and online AR/VR focused Rhino 3D User Group meeting.
Meeting Information:
  • This live online Rhino 3D event took place on Thursday 25th February 2021
  • Time – 18:30 – 20:00 (UK Time)
  • The meeting was recorded and the video is shown below

We had two live presentations: firstly from the Parametric Applied Research Team (p.art) at engineers AKT II in London, then from Mindesk, which develops a real-time platform for CAD software that allows the editing of 3D models in virtual reality. Following each presentation we had live Q&A sessions with our presenters.

Meeting Presentations:
A promo image for AKT II p.art team projects

AKT II have extensive experience in collaborative computational design. They have developed a suite of tools covering structural, façade, bioclimatic and sustainable design.

This presentation will use a complex timber gridshell as the focus of a case study on the use of advanced digital modelling tools and their interoperability. Here, a mix of an in-house developed suite of tools paired with Rhino and Grasshopper allowed the AKT II team to respond to an extremely challenging site. Complex analysis of the timber gridshell allowed them to engage fully with the material performance, enabling the architects to maintain a ‘light-touch’ approach and providing a sustainable output.

Screenshot of a real-time rendering project using Rhino 3D, Grasshopper and Unreal.

Mindesk is a plugin for Rhino 3D that allows users to edit .3dm files in VR or render in real time through Unreal Engine. The latest developments include multi-user VR sessions, compatibility with Varjo (retina-resolution headset) and HoloLens 2, and a series of advanced features for the Unreal live link, developed from the feedback of prominent AEC professionals.

Mindesk simulation showing a car showroom with a white car added in Virtual Reality

Watch the full Video Recording of the meeting here:


Organised by Simply Rhino

Sponsored by BEAM and PNY


Watch the recording of our previous AR/VR for Rhino and Grasshopper meeting, featuring Heatherwick Studio and Epic Games, here.

AR/VR for Rhino and Grasshopper User Group Meeting | 11th February 2021
https://rhino3d.co.uk/events/ar-vr-for-rhino-3d-and-grasshopper-user-group-meeting-february-2021/ | Wed, 20 Jan 2021

Join Simply Rhino, Softroom and Fologram for our AR/VR focused Rhino 3D and Grasshopper UGM. Meeting recording and video transcript available.

Join Simply Rhino, Softroom and Fologram for February’s live and online AR/VR focused Rhino 3D User Group meeting.
Meeting Information:
  • This live online Rhino 3D event took place on Thursday 11th February 2021
  • Time – 18:30 – 20:00 (UK Time)
  • The meeting was recorded and the video is shown below along with a transcript for download

We had two live presentations: firstly from architecture and design studio Softroom’s founder, Oliver Salway, then from the mixed reality platform Fologram. Following each presentation we had live Q&A sessions with our presenters.

Meeting Presentations:
Virtual Reality Scene showing people gathered around a bar area inside a plane

Softroom | For the last 25 years, architecture and design studio Softroom have been creating world-class physical and virtual environments for clients ranging from Wallpaper magazine, the Virgin group and Turkish Airlines to the V&A and the British Museum, and are currently developing concepts for the NHS and the BBC.

Softroom founder Oliver Salway will discuss his ongoing research into the opportunities and challenges VR/AR technologies present for architecture and how they may disrupt what we design for the physical world, from the living room to public space, as well as considering the wider issues surrounding appropriate applications for these technologies.

Virtual Reality Scene from Designers Softroom

Fologram | Holographic Construction with Fologram – Immediate opportunities and use cases for mixed reality in AEC.

Gwyllim Jahn is the co-founder and creative director of Fologram, a Melbourne-based startup building mixed reality software solutions for architecture, engineering and construction. Fologram’s software overlays physical environments with precise, interactive digital models that can be used for rapid prototyping of mixed reality applications to solve design and construction challenges.

Gwyllim will share how Fologram’s clients and partners have identified high value use cases and immediate opportunities for mixed reality to reduce construction time, cost and risk through several recently completed case study projects. These include work by All Brick to replace traditional 2D drawings with interactive, shared and in-situ holographic models describing all construction stages of the brickwork in the Royal Hobart Hospital, and the effective use of analogue tools and traditional steam-bending techniques to fabricate the complex double-curved structure of the Tallinn Architecture Biennial pavilion.

The talk will conclude with a brief demonstration of new software platforms for creating and sharing augmented reality experiences currently in development at Fologram. Rhino users are encouraged to download and install the Fologram for Rhino plugin and the Fologram for Mobile app in order to experiment with streaming their own models from Rhino and Grasshopper as there will be time for a brief Q&A around user experience at the end of the presentation.

Get Fologram:

Fologram for Rhino – www.fologram.com/download
Fologram for Mobile – search on the App Store or Google Play
Fologram for HoloLens – https://www.microsoft.com/en-us/p/fologram/9nn9sjwh9qc1

For more information, case study projects and examples see: 
www.fologram.com
www.vimeo.com/fologram
www.instagram.com/fologram


Watch the full Video Recording of the meeting here:


Organised by Simply Rhino

Sponsored by BEAM and PNY

Watch the recording of our previous AR/VR for Rhino and Grasshopper meeting, featuring Heatherwick Studio and Epic Games, here.

AR/VR for Rhino and Grasshopper UK UGM | October 2020
https://rhino3d.co.uk/events/ar-vr-for-rhino-and-grasshopper-uk-ugm-october-2020/ | Mon, 28 Sep 2020

Join Simply Rhino, Heatherwick Studio and Epic Games, for our live & online AR/VR focused Rhino User Group meeting. This was our first online AR/VR User Group meeting and Heatherwick […]

Join Simply Rhino, Heatherwick Studio and Epic Games for our live and online AR/VR focused Rhino User Group meeting.

This was our first online AR/VR User Group meeting; Heatherwick Studio started the evening’s presentations, followed by Epic Games (developers of Unreal Engine and Twinmotion), and the meeting finished with a Q&A session with our three panelists. The meeting date was Thursday 8th October 2020, 18:30–20:30 (London/UK time).

  • For the video recording of the meeting please go to the foot of this page

Heatherwick Studio Logo

Heatherwick Studio has been working with game engines as part of its design workflow for years now and has developed custom design workflows and techniques that enable these processes.

Silvia Rueda will provide an insight into Heatherwick Studio’s use of Immersive Media, with a focus on the role of landscape design and the use of Unreal within its design process and workflow.

Image: Courtesy of Heatherwick Studio


David Weir-McCall from the Epic Games Enterprise team will take a look at the many ways that people are utilising the power of the Unreal Engine in the AEC industry to go beyond visualisations, to help bridge the gap between ideas and reality.

Looking at use-cases in the industry we will explore the different integrated workflows with Rhino and Grasshopper and how they are being used to communicate ideas, design and build in real time, and link up to sensors to create fully functioning digital twins. This includes covering works by relevant partners including Mindesk, Speckle & BHoM.

Images: Left – Courtesy of AHMM; Right – Courtesy of SPP and Imerza.

Meeting Presenters:

Organised by Simply Rhino

Sponsored by BEAM

Thanks to both Heatherwick Studio and Epic Games for joining us at the meeting.

For details on the previous AR/VR for Rhino & Grasshopper meeting you can visit here.

AR/VR for Rhino and Grasshopper UK UGM with Heatherwick Studio and Epic Games – Video Recording Transcript

We have made a transcript of the meeting recording; if you’d like to follow along, here it is:

Paul: Right, welcome everybody.  This is the first of our virtual versions of our AR VR User Group Meeting, held here in the UK.  It’s actually the sixth of this type of meeting, but the first one we’ve held virtually.  We’ve met (for this format of meeting) before at AKT II, at Grimshaw, at Bryden Wood and at Heatherwick Studio offices, and at SOFTROOM as well.

I’m joined by some friends here from… two from Heatherwick Studio, Pablo and Silvia who will be presenting first.

Pablo is the Head of Geometry and Computational Design at Heatherwick Studios.  Silvia is the Lead Designer of the Immersive Cluster at Heatherwick.  So, they’ll be presenting first for 30 minutes or so, and then we’ll be hearing from David Weir-McCall from Epic Games, part of the Enterprise Team in the AEC area.

Just a couple of other things to mention here.  There’s quite a big group joining us.  There might be as many as 300 or so, so please with questions, if you could address them in the questions panel rather than the chat panel, that would be great.  They’re going to be monitored by myself and Steph who is in the background helping out.  So, yes, please add them in questions.  As there is quite a lot of you, there could be potentially quite a lot of questions but we’ll do our best to get as many questions to the presenters as we can.  There is also the chat dialogue opportunity.  You can use that to talk between yourselves, if you want to communicate with anyone else that you know is also participating.

What else is there to say?

We’re having this presentation first from Heatherwick.  There’s a couple of polls that we’ll ask you to complete.  Then we’ll hear from David, then Q and A’s for both presenters, then a round up.  Then after all of this, there is an opportunity to join us on the Mozilla Hubs platform, a fun little meeting, because normally after these things we would have a nice social meet up, some pizza some beer.  We can’t do that this time of course, so we’re going to invite you to come along to the space at Mozilla Hub.  Some details on that will follow after everything.

So, what I’m going to do now is handover to Heatherwick people.  Do you want to just say something as an introduction, Pablo, first?

PABLO: Sure.  I think we’ll jump on the presentation.

Paul: Okay, I’ll jump out and see you all later.

PABLO: Okay.  Well thank you Paul and Steph for having us here today.  We’ve been part of this AR VR community for some time now and we love always to see what is happening in the rest of the industry and obviously we’re very happy to do something this time around.

So, we are from Heatherwick Studio, and we are a team of problem solvers and designers based in the heart of Kings Cross.  However, during these times, I think we’re mostly working from different parts across the UK, from our own homes.

Today’s presentation is going to focus on the studio and specifically on our Unreal Engine workflow and how we use it for landscape design.

We are going to try to split the presentation into four main chapters.  The first one is going to be covering how we use these visualisations in the studio, and then we’re going to talk about the landscape design and the relationship between this and how we visualise.  Then we are going to jump into a case study of one of our projects and we’re briefly going to go through future developments.

So as Paul mentioned, my name is Pablo Zamorano.  I am Head of the Geometry and Computational Design Department in the studio.  I work across all studio projects with a team of designers that are also passionate about engaging in complex design challenges and digging deeper in terms of how things come together, from early stages to the very latest ones.  I also work with the great Silvia.

SILVIA: Hello, my name is Silvia.  I am a designer and Immersive Media Specialist at Heatherwick Studio.  I have a background in architecture and interaction design and my focus is to develop and communicate the design ideas using Unreal Engine.

PABLO: So, as I mentioned, we are based in London and we try to focus on projects across all different scales and types; we not only design buildings, but also objects and landscapes, as we will see in today’s lecture.

As I mentioned before, we work across scales and typologies at every possible location.  I think our main focus is to find projects that can potentially allow for a positive social impact wherever we are working.  We have a special focus on material and craftsmanship and we are really focused on how things actually feel for people at human scale, at one to one scale.  We generally like to design things that you can approach with your body, that you can feel and understand as positive elements of the human scale.

These are three examples of recently finished buildings: one in Kings Cross, called Coal Drops Yard, the middle one, A Thousand Trees in Shanghai, and the third one, The Vessel in New York.

We also have a focus on Applied Research and particularly in my department, some of the things that we cover are trying to find the relationship between not only the physical world, but also the digital one.  We try not to ever get too interested in any one thing we do.  Rather, we try to always think very deeply into the ideas and how we can develop them further.  So, we use any tool we have at our hands, and if we don’t have the tool at our hands, we try to either look for it or find a way to get it.  So, really, we try to open our spectrum of interaction between the digital, devices, fabrication methods and craftsmanship.

We apply this through different tools we have put together through the years; we can now run simulations of the buildings, but these tools also allow us to quickly get through every single layer of any complexity and scale that we want to work on.

We are also partnering with different organisations, private and public, and some schools like the IAAC in Barcelona, where we’ve been trying to focus on advanced fabrication using, in this case, wood as the material, and working with simple elements, though trying to use them in a much more complex way and trying to solve, or bridge, the gap between complex design and the fabrication materials we use.

So, you can see in this case, with robotic fabrication, we tried to realise the designs that the students put together in this short workshop.

I really love how robots move, and the potential that we can get from them sounds very interesting.  We’re also quite interested in the relationship between our body and the machines we work with, and we understand what are the limitations of a robot and what are the limitations of our bodies.  So, we try and bridge this gap with tools like Augmented Reality.  For instance, in this case, we are using the HoloLens and mobile devices to put together assemblies that later on we can add to the bigger pieces manufactured by the robots.  So, it’s an overlaying of AR and the physical things, and here are some examples of the final pieces.  So, again, a piece like this obviously is, I would say, too complex for a robot to run through the whole process avoiding clash between the optics in some parts, and also too complex for a person to put together without any kind of traditional guidance.  So, I think it’s very interesting how these two worlds merged together allow us to explore geometry further.

This is another example of the use of Augmented Reality in the studio.  This is using Fologram and the HoloLens, in our workshop, where we’re using it to wire cut some foam blocks, and you can see how the model in Rhino 3d is moving, following how the physical object is moving.  It’s not unidirectional; it also goes from the real world back into the digital world.

Here is how we use it (Fologram), not only as a presentational tool, but rather to understand how digital mock ups can interact with physical ones.  So, we’re testing a small object, a lift button, and then we have another digital option for the same object.  If you look at it on the screen, it may look okay, or you can say whether you like the design or not, but when you look at it with the goggles, you realise that you’re measuring with your body, and it’s obvious this object is far too big, and we can action these things in real time.  So, the understanding of scale is key when we’re using Augmented Reality.

We’re also teaming up with some other people, in this case Thornton Tomasetti, the engineering team, and trying to look at how we can customise some of these tools.  We had this big ambition to have an AR tool where we can not only import models that we’ve created but also physical models that we’ve made, or even sketches, and then turn them into something else, like running any kind of modelling, or sun exposure or wind analysis.  And then also maybe editing the geometry from the mobile devices and then placing them back.

Obviously for this, we did a quick sketch actually over one day.  So, we shortened the spectrum to maybe three areas: having an import, being able to transform the geometry and then place it back, and running some sun exposure analysis, with the key thing in common that the tool should work cross platform.  So, it’s not an app that only works for Android devices or iOS devices, but rather, in this case, a web based platform that you can access on either a mobile device or even your computer.

This is the result.  So, one of the interesting points here is that you can place a geometry based on a tracker, so once you move the tracker, basically, the geometry will follow, and what you see on the screen on the upper right hand side are the parameter controls that we allow the users to change: the scale, the rotation, but also the day of the year and the hour.  So, you can see not only the object moving but also the environment reacting to it.  Because it’s basically following the tracker, if you have to place it in a different space, you can just literally use your hands and move it along with you and it will follow.  So, we’re very excited to carry on this collaboration with Thornton Tomasetti.
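The day-of-year and hour sliders Pablo describes map onto standard solar-position maths. As a hedged illustration only (a generic approximation, not the Thornton Tomasetti tool's actual code; the function name and simplified declination formula are our own):

```python
# Illustrative solar-position approximation for sun-study sliders.
# A generic simplification, not the actual tool's implementation.
import math

def sun_position(latitude_deg, day_of_year, solar_hour):
    lat = math.radians(latitude_deg)
    # Approximate solar declination for the given day of the year.
    decl = math.radians(-23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10))))
    # Hour angle: 15 degrees per hour away from solar noon.
    hour_angle = math.radians(15.0 * (solar_hour - 12.0))
    # Altitude of the sun above the horizon.
    alt = math.asin(math.sin(lat) * math.sin(decl)
                    + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
    # Azimuth measured clockwise from north.
    cos_az = ((math.sin(decl) - math.sin(alt) * math.sin(lat))
              / (math.cos(alt) * math.cos(lat)))
    az = math.acos(max(-1.0, min(1.0, cos_az)))
    if hour_angle > 0:  # afternoon: the sun has moved to the west
        az = 2.0 * math.pi - az
    return math.degrees(alt), math.degrees(az)

# Example: London (51.5 N), day 172 (late June), 3 pm solar time.
print(sun_position(51.5, 172, 15.0))
```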

SILVIA: So, understanding that, I’m going to do a bit of a jump and we’re going to talk about landscape and how this is integrated in this design vision.  Basically, we can say that we have nature embedded in most of our projects, from different scale projects: large projects, Toranomon-Azabudai on the left in Tokyo and 1000 Trees on the right in Shanghai, as well as medium sized buildings such as the residential tower on the left and Little Island in New York on the right, and as well, small buildings, for example pavilions that are integrated with landscape, like the recently completed Maggie’s Centre in Leeds.

So, how does Heatherwick Studio see visualisations?  We think it’s critical that a richness is applied in most of our visualisations.  Traditionally, we used to have a series of stills that combined V-Ray rendering with planting added in post-production, even some hand sketches, which limits our ability to present the design model in a 3D way.  It means that landscape becomes something that goes on top of the image at the end and it doesn’t go along with the design process of all the geometry and architecture.  This is very challenging when it comes to projects like Little Island where planting is almost as important as the whole architecture itself.  So, it’s very difficult to imagine how this would be without any planting.  It looks very incomplete and empty.

So, for example, this is how landscape defines and gives context meaning in architecture and geometry.  This is Al Fayah Park in Abu Dhabi where, if we look at the architecture elements, it’s very difficult to understand the scale and the sense of the space that we are designing for.  But if we implement and we put all of the landscape over it, then we have the whole picture and the idea of what we want to see and what is the main concept of it.  In this case, we used Twinmotion.  This was five years ago, so it was the first 3D animation and planting, all placed together with architecture.  It was great because the client and the consultants for the first time really got to see a 360 of both of these things placed together in one animation.  So, before moving forward, describing what our visualisation workflow is, we wanted to give an overview of how this sits in our wider design interoperability workflow, which we developed… basically, we develop all of our projects using different softwares and platforms in different stages.  Rhino is the main design software for all of the stages.  Revit is very important in the later stages and Unreal is the main visualisation tool.  So, in this diagram, we can see which are the highlights of each of the stages, understanding that for concept and schematic design we use Rhino and then we would visualise it in Unreal, and for the later stages, Revit, and we can have a link from Revit to Unreal, which for sure we will use in the later stages.

So, for Landscape Visualisation, we developed our own workflow basically.  Because we are looking to see more things than just renders and videos, we are as well researching 360 views, virtual reality and immersive media.

As part of the studio setting, we evaluated the different ways of visualising landscape: the most traditional ways, that is Rhino with post-production options in Photoshop or V-Ray for Rhino, and then we compare these to Rhino and Unreal.

So, in the first way, we use Rhino to just render out a simple view, then we apply all the vegetation in post-production.  This will give us a fully customised image at the end with a good quality, but it can get very time consuming, especially if we want to represent more than one still.  Making changes can get very tricky as you will just enter into a world of Photoshop and layers that is very hard to handle.  As well, we will never be able to create a real time view using this method.  When we use Rhino and V-Ray, we realise that it has an amazing quality and very photo realistic results, but we realised as well that because the polygon count of the trees that we want to render is massive, it will take tons… an astronomical amount of time to render more than a few views.  As well, we’ll not have real time alongside the project.

So, after comparing all of these methods, we realise that Unreal is the way to go.  We can see the project in every possible angle.  The landscape is already embedded in the real time modelling.  Everything is fully customised and the most important thing is that apart from customising the look and feel of the project, we as well can customise all of the landscape assets that we want to implement and need.

So, this is the first project that we used Unreal in.  It’s 1000 Trees in Shanghai.  As the name says, it’s a lot of trees and a lot of landscape.  So, we wanted to be very accurate towards it and this is the first video that we did, an animation.  This took us around two months to get our hands in.  This is quite a lot of time to do an Unreal model, but from here we learnt quite a lot of things, so that now it just takes a few days or a few hours to develop an Unreal model.  So, this was pretty exciting at the beginning as well.  The client was seeing the project years in advance with fully accurate planting and the 360 view, and you could see the whole scheme, all placed together in the contextual site.

So, it’s quite nice to compare the reality on the left, that is almost at completion, and the Unreal that we had years in advance.  So, from this, as I was saying, we learnt a few things.  The first is to create a Heatherwick Studio template that will begin with some materials that we commonly use in our projects and, as well, the atmosphere of Unreal would already be adjusted.  Going forward, we built a landscape asset library which, apart from having different species of plants in it, the most important is that we have different types of plants.  So, this means we will have conical trees, global trees, quite a lot of types of trees.  As well, this can be subdivided into different categories, depending on where the project is located geographically.  So, this means that if you start a project in California you already have a folder full of the planting that you will use in California and it’s accurate to the surroundings.

The last one is that we improved our workflow with Datasmith; we use Rhino as the main tool of design and it’s quite easy to update any geometry in Unreal.

So, as a conclusion, we were trying back in the day to produce just one Unreal model and this would be at development design and it would be referenced until construction.  What we are trying to do now is to have different Unreal models from concept until construction, which will be evolving and changing all the time while we are changing the design.  Now for the landscape design process: basically, landscape design plays a key role in the visualisation of most architectural projects.  Despite being so important, it’s often overlooked and considered only as something that is placed in later stages of the visualisation process.

So, basically, to sum up what is our landscape design scope, it can be divided in two main packages.  Hard landscape, that is everything that is manmade structure, so it will be pavement, furniture, landscape structures, and soft landscape, that is everything that is live components, so it will be trees, all planting, land form and water features.  So, we can divide it in four layers starting from the bottom, land form and water, then hard scape and furniture, then lower planting and tree planting.  We can conclude that really the true heart of landscape, is the two top layers, that is the tree planting and the lower planting.  So, we can see here the comparison on the left is just the first two layers, and in the right, we can see the third and fourth layer, which gives the very characteristic aspect of what is landscape.  So, this means that we should pay attention and get the whole picture and make it accurate if this is the most seen and most important part of landscape.

So, this is normally what we will deliver in our landscape package from Revit.  So, you can see it is very dry; it has a very traditional method of communicating all of the design information.  So, we have on the right a technical drawing defining areas of planting colours, each of them colour coded to differentiate different hatches, and on the left we schedule each planting and describe a specific species of plant.  This sounds very crazy.  Apart from that, we will combine this with reference images that will define what we are saying in the schedule, and from there, there is a massive jump between the technical drawing, what the landscape architect asks for, and the deliverable images.  So, using Rhino and V-Ray and Photoshop, there is a lot of artistic interpretation between what we are seeing and what really was delivered, requiring a lot of manipulation in each image, and we are lacking a lot of fidelity and accuracy towards what we are describing.

So, to conclude, we can see that the landscape design is received as very technical and often detached from the project.  It’s very hard to visualise landscape properly as it is very time consuming and normally the quality is not that good.  We have a clear need to make planting design and development more inclusive and interactive.  That means that we wish to involve the design team and the clients more in the landscape design decisions.  So, for this we use Unreal and we think that this is the best way to visualise landscape.  It’s a dynamic tool that will allow everybody to comment and to see the outcome immediately.  Planting is accurate and you get a holistic view of an integral project of landscape and architecture.

So, we have this case study in Tokyo, Japan, where there is a massive mixed-use development which includes residential, retail, office and education uses and has an extensive area of landscape.  We are trying to create intimate human scale gardens as well as a large scale city.  As you can see in the plan, it’s full of landscape.  What we tried to do was generate different narratives and characteristics for the landscape.  So, at the entrance, we wanted to generate gateway plazas with cherry blossoms, to invite everyone to go in.  In the middle, we wanted to create urban orchards so that people could interact with all the landscape, as well as woodland grasses.

But we are just going to focus on the central garden.  So, this is an enlarged plan of the planting scheme.  So, basically, we have different mixes of shrubs and grasses and these are the descriptions of each mix.  Each one will have different species, described by colour, texture, height, whether they are evergreen and so on.  For example, we take shrub mix 01, which will have a warm colour, red, orange.  We already know the heights of all of the planting, so what we want to do is make this bespoke in Unreal, as accurate as we want.  So, by loading a lot of Unreal assets into our project, we can then edit the colours of any of the flowers.  We can remove some leaves and begin to create seasonal changes if we want; as well, as we’re seeing at the top, we can have a cherry blossom tree and then, just by changing the colours of the leaves, it becomes another type of tree.  As well, we can combine two assets, so if we have ivy and we combine it with some flowers, it can become a beautiful jasmine flower that we can hang anywhere that we want.  So, from doing this, we recreate all of the planting that the schedule was showing us.  As well, for the grasses, what we did was duplicate one of the simple shrubs that we had in Unreal and create a new static mesh, and for herbaceous ornamentals as well, there is a combination of grasses and flowers.
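The duplication step Silvia mentions (copying a base shrub to make a new static-mesh variant) can be automated with Unreal's editor Python. A minimal sketch, assuming the editor's Python scripting plugin is enabled; the asset paths are hypothetical examples:

```python
# Illustrative Unreal Editor Python sketch: duplicate a base shrub mesh
# to create a new static-mesh variant for a planting mix.
# Asset paths below are hypothetical examples.
import unreal

lib = unreal.EditorAssetLibrary

source = "/Game/Planting/Shrubs/SM_Shrub_Base"
target = "/Game/Planting/Mixes/SM_ShrubMix_01"

if lib.does_asset_exist(source) and not lib.does_asset_exist(target):
    lib.duplicate_asset(source, target)
    lib.save_asset(target)
```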

So, when we have already created all of the assets, what we need to do is create some folders in Unreal that will have all of these static meshes inside, or all of the plantings, and then we import just the surfaces that are already colour coded to match the folders, which means that anybody can jump into Unreal without being a landscape expert and can already plant and make design decisions.

So, this is explained here in a small video, where we have the surfaces with the colours and the folders matching the surfaces.  Then we choose the plantings.  We drag and drop them from the folder into Unreal.  We select which ones we want to use and then from there, we change the density, which means how much planting you want to paint when you paint it in Unreal, basically.

So, the density, this is a bit of trial and error, and then you just… with one click, you will immediately have all of the landscape on the surface that you chose to plant.  So, then it will be exactly the same for the next one: you unclick the ones that you had and then you just click the new mix and you will paint it as well.  That’s how we began to create all of the lower layer planting.

As well, we do it for the grasses.  That is a… it’s very difficult to get accuracy in grasses, but in Unreal we can manage to do that.  At the end we just hide the surfaces that we imported, we turn on the soft scape and we can meander inside of the project and literally see if it looks good or not.  Sometimes, you just don’t have a say in these situations and it’s super nice to design and see if you can change some of the design things that you already thought about.  Sometimes you just need to do that right.  You can walk around and have a real time 360 view of all of the corners of the project and decide on all of the landscape that you need.

In summary, you will have a blank model for the colour coded areas, place the lower planting then import the bubble trees and then you will change them for real trees.

This means that we will have a fully coordinated model in Unreal, giving us the most accurate planting that we can have.  We will have more than one final landscape version so that we are encouraging people to mix and test different versions to see which is the best combination for the project.  So, it means that Unreal will allow us to interrogate landscape design in a more holistic way, involve all of the teams and consultants in landscape design progress and decisions and, most important, we will be spreading the knowledge of landscape design all around architectural domains.

PABLO:  Following from what Silvia just showed us, we are looking into future developments and some of the things that we really care about are these workflows.  How do we come up with the easiest ways of making the different softwares we use communicate?

So, in this diagram that Silvia showed before, there are three main softwares we use for design, with Rhino being the key backbone of the process all through, from early on till the later stages.  So, how can we connect these three softwares together is something that we’re always asking ourselves and trying to improve.

So, about a year ago or so, we teamed up with Mindesk to answer this very question.  So, how can we make this connection between the softwares as smooth as possible, knowing that Mindesk already had some of this potential built in: they were using Unreal Engine to turn the Rhino environment into a virtual reality one, and we thought well maybe, if these two softwares are already talking to each other, there must be a way now to just progress that into a link where we can have both environments at the same time.

Here you can see some of the results of this and there is… this is now live and is a tool which is part of what Mindesk offers as part of the software.  So, if you modify the geometry in Rhino on one side, then Unreal will automatically show you what changes are happening.  This means there is a direct pipeline between the two worlds and so any geometry that you move is going to move across, any geometry that you import is going to get imported across live.  If you modify it, it’s going to get modified.  If you hide it, it’s going to hide on the other side as well.  So, it’s going to make this process of turning these lollipop trees into more realistic, real assets much easier.  So, here you can see how things are moving live.

And another thing that is very interesting is that most of the people that engage with Rhino in the studio don’t necessarily understand how Unreal works.  So, here there is a view link where you can actually control the viewport in Unreal directly from Rhino.  So, if you don’t understand how to navigate the Unreal world, you can still modify your views from the Rhino viewport and then Unreal is going to mimic that.

Very recently, the tool has been evolving and you can now automatically modify the assets that are going to be assigned to this.  Unfortunately, we don’t have a video showing this because it came up quite recently and we could not record it.  But imagine these lollipop trees in Rhino will automatically get converted into more realistic versions based on our landscape template.  Of course, this is not only related to what happens in traditional Rhino geometry, but can also be driven by Grasshopper geometry.  So, this obviously has a huge potential for us, because you can quickly test different options, but also animate things in Grasshopper very quickly and see how things move or change in the Unreal world.

Also, as Silvia mentioned, we engaged with Twinmotion something like five years ago.  Twinmotion is the very reason why we started investigating real-time rendering in the studio and we are also excited about the new developments of this software; we are currently testing it and super excited to hear possibly more about what is coming up with this in the next lecture by David.  But we really like the ease of use of Twinmotion and we love the fact we can control very quickly the season and make it change.  Now, one of the things that is quite key for us is to be able to modify the planting assets and to kind of customise them with our own planting, based on our own landscape design.  So, this is something we’re hoping Twinmotion could introduce in the future, so David, please take note of this.

I think with this, we can finish our presentation.  So, thank you so much and we will be answering your questions.  Thanks.

Paul: Okay, great, thank you so much Pablo and Silvia.  Okay, I have some questions for you.  I’ll start with some workflow questions I think, first.  We’ll just start with this question from James.  Have you started to use Rhino Inside Revit?  Has it been useful for you or have you found it not developed enough for you yet?

PABLO: I guess this is for me.  The answer is yes.  So, we’ve been using Rhino Inside for quite some time at the studio already.  We’ve already prepared our internal templates that focus on different customised workflows that we always use.  It’s been an interesting journey because Rhino Inside obviously is under development.  It means that one build may change from the previous one.  Any attempt to standardise something or a process may need reviewing with the next release of the Rhino Inside version.  But we’re still very excited and actively using it in very large projects actually that are now gearing up for construction.  So, it is a very powerful tool and it’s not only for Revit.  I’m also hoping to hear more from someone who is using it a bit more with Unreal.

The other very good thing about Rhino Inside is it allows you to hack it.  So, we have some coding… if the tool is not doing it for you, you can actually come up with a tool that will do it.

Paul: Okay, I’m just going to go down these questions in any order now.  Are you looking at procedural generation of plants?  I guess a question for Silvia?

SILVIA: We have tried it by literally creating the trees from scratch, but we have developed a better and faster workflow by using the assets that Unreal already gives us.  So, it is literally a combination of all the libraries that are out there that people are using, and making it our own.

Paul: Something else here from Kevin.  At the point of workflow development that Silvia was talking about… I’m just reading this out, so when you say Unreal, are you building this library and templates for Unreal directly, or a library for Twinmotion?

SILVIA: It’s for Unreal.  We have it already, the foliage in Unreal and it’s just for Unreal.

Paul: Thank you.  Question from Lynne.  Can the assets library be shared across teams and projects so everyone has the same and updated assets all the time?

SILVIA: Correct.  We start with a template and then we will be narrowing down where the project is located and then we will give the template with already the library of plants that we think we can be using there.  If not, we will add more plants if needed to that template, depending on the project.

Paul: Okay, thank you Silvia.  Right, do you use VR with clients and/or collaborate within Unreal?  Question from Martin Johnson.

SILVIA: Not at the moment.  We are the owners of the geometry and Unreal itself.  We don’t share Unreal but we share its animations.

PABLO: So, to the point of VR with clients, I think the answer is yes, sometimes.  In some of the projects we’ve been working on, the clients actually turned out to be quite sophisticated and they have asked directly for either walkthroughs or a 360 animation where they can use their own VR domes to walkthrough the project and similar things.  So, yes.

Paul: Another workflow question.  Maybe David can answer this, or maybe you’d know as well Pablo.  Does Unreal have a Revit plug-in and how does the Revit, Rhino, Unreal workflow work?  Just before you answer that, it’s probably a good time to actually just mention that we do have a sponsor for this meeting, and that’s the developers of BEAM, which is a solution for interoperability between Rhino and Revit.  Anyway, I know Pablo has used BEAM and may be an advocate for that solution as part of a toolset, but anyway, did you get that question Pablo?

PABLO:  Yes.  So, we haven’t connected Unreal and Revit directly yet.  That’s the quick answer, but there are ways of doing it.  I know there is some development using Rhino Inside to do this, and also I know that Mindesk… Gabriella may be somewhere in the audience tonight, I hope.  So, hopefully after this, in Mozilla Hubs, you can try to find me and ask directly, but I can tell you that some very good news about this may be coming your way from Mindesk.

Paul: Very good.

DAVID:  I’ll elaborate in the next presentation.

Paul:  Quite a number of questions here.  I don’t think we’re going to get through them all but let’s see.  Right, so Pablo, have you tried Unity?  How would you compare Unreal versus Unity?  That’s a great one in terms of ARVR interface, software interface or within your workflow?

PABLO:  We have tried many different platforms and I think the answer to why we selected Unreal is because of the quality we can achieve with it.  So, I know some others, like Unity for instance, are closer to the larger hacker community.  People want to dig deeper into these rabbit holes in software.  But to be honest, what we really care about is the actual quality we can get from a software at the end of the day.  For us, the best results so far have been coming from Unreal, and this is why we engaged with it.  But yes, we always try any possible solution that is out there in terms of the ease of use and the quality.

Paul: Great, thank you.  Silvia, are you using Speedtree to create the vegetation or are you using a library of some sort?

SILVIA:  So, as I said before, we are using just the Unreal libraries from Epic Games.

Paul: Fine, okay, could you talk a little bit about how you onboard clients into using and viewing the work in Unreal and generally in ARVR, and how receptive they’ve been to this?  It’s a question from Pam Harris.

SILVIA:  So, basically, it’s a win/win situation.  They love it.  They are very engaged with it.  They are always asking or wanting Unreal.  It’s the best way for them to see and understand all of the architectural terms and everything we are talking about and when we engage them with Unreal they get the whole picture and it’s very clear.  The next steps that we need to develop are super clear.  Sometimes it’s very tricky because we have a lot of things to finish because you can see everything, but I think it’s very useful. I don’t know if Pablo, you can say anything else about it?

PABLO:  No, I think that says it all.  We use it for every single presentation with the client.

Paul: Very good.  I think one more question from Libney.  Is it possible to upload the Unreal scene to the Cloud, so the client can check in a web browser?

PABLO:  I think that’s a question for David, but there are many ways of actually doing this and I know you can do this in Unreal and export to many different platforms, not only web based but also augmented reality models and so on.

Paul:  Okay, so maybe that’s something we can come back to.  Okay, I think maybe… I’m just going to ask you one last question, and then we’re going to go to David’s presentation.  But if there is time, if we’ve missed some questions and you really want anything, if there is something really pressing, please do let us know that it’s an urgent question and we’ll make sure that we get to it at the end.  I’m trying not to leave people out but it’s a bit of a challenge.  Okay, let’s see.  What are you using for version control – Perforce, SVN or something else?  That question doesn’t mean much to me.  Do you understand the question?

SILVIA: No, I didn’t, sorry.

PABLO:  So, how are we dealing with different versions of Unreal?

Paul: Yes, I guess so.  They are talking about asset version control.

DAVID: It’s for multiple users to engage in the same Unreal.  So, are you using it as across the office for multiple users on the same scene, or are you using it for individual users for individual scenes?

PABLO:  We are using it for individual user for individual scene.

SILVIA:  Correct, but again, we also began to use levels so everyone can jump into the level and change anything that they need to change in Unreal.  Like somebody will have the core Unreal model.

Paul: This is the last question before we go to David.  Do you do all the lighting in Unreal or do you add additional surfaces to be used as light sources in Rhino for instance?

SILVIA:  The lights, we use them all in Unreal, yes.

Paul: Fantastic.  Thank you Pablo, thank you Silvia.  We’ll see you again at the end.  We’re going to hand over now to David.  Thank you very much.

Okay so you’ll be made presenter now David, and I’ll say see you later.

DAVID:  Okay, you should be able to see my lovely background wallpaper.  First of all, huge thank you to the guys at Heatherwick Studios.  They just did a great job of showing you kind of what I want to share with you as well, which is some of the great use cases which are coming out of the architectural engineering construction industry.  So, I’ll quickly start my presentation, but before we get started, I just wanted to share a quick introduction for those of you who maybe aren’t aware.  I work within the architecture engineering industry within Epic Games, which basically means that I focus on speaking to architectural engineering construction firms about their use of real time rendering tools in the workflow and the ways they innovate work processes and outcomes.  And what we do is we go around and we talk to a lot of people about a lot of uses of Unreal, the different uses of Twinmotion, and the big thing that we usually do in these presentations is we like to share the use cases behind it, and Heatherwick is just one of those great use cases in landscape design and how their workflow has come out.

I want to share another couple of examples with you today, but first of all, just in case there is anyone on the call who is a little bit unsure of the Unreal Engine or Twinmotion, I just want to spend a couple of minutes just quickly running through that for everyone’s benefit.

First of all, what are we?  Well we’re Epic Games, and we have this great platform called the Unreal Engine.  Now this is what is used to create a number of the games that you may recognise, that big one up there, Fortnite, which I recently joined a week ago, and I get my ass kicked by nine year olds on a daily basis now.  But it’s also used as a background game engine for Infinity Play, Gears of War, and we also license out to both the games sector and the non-games sector.  So, other game studios are using Unreal Engine as their tool.  The non-games is where I sit and a number of other great guys sit as well: the film and media, the broadcasting, and the automotive and manufacturing.  If anyone is a fan of Star Wars, The Mandalorian was filmed using Unreal technology, which is exciting.

But our big thing, which I think again Heatherwick did a great job of, which is why we’re in the AEC space, is this ability to bridge ideas and reality together.  Within the architecture engineering construction community, our output isn’t the same as the games or the film.  We create reality, real buildings, and construct those from ideas.  So, we see these real time render tools as the fastest possible way to share and engage stakeholders and we see outputs of those ideas.  Again, some of which we’re going to see a little bit later on.

But again, if you’re unfamiliar, we have two lovely products.  We have Twinmotion, and we have Unreal.  They have both got their different use cases and I just want to define exactly what they are so that you can understand, when we talk about things moving forward, where the use cases for each of them sit.  The way that I usually describe Twinmotion is this idea that it’s architectural visualisation in a few clicks.  Essentially what Twinmotion is, it’s the Unreal Engine but with this wrapper around it, which has been customised and set up for ease of use and a very quick and simple learning curve, so that you can create great visuals within a few clicks, as I said, not one click but a few clicks.  It’s the comparable tool to what we see people doing with Enscape and Lumion.  It’s just there for the everyman; every architect and engineer can have this and work alongside the proprietary tools.  Here is a quick video demoing it.  What we really have in here is what Heatherwick alluded to and it’s a number of key things including how it speaks to Rhino, the way that you can use the assets whenever they are in the Engine.  We have a thing that we like to call Smart Assets: trees that interact with environment and people, animations and cars, and things like x-ray materials for engineers to be able to see their designs.  The other great thing we see about Twinmotion is this asset library of about 2500 assets.

What’s coming very shortly, because Heatherwick were asking about road maps, is that these assets are about to be released on the Unreal Engine Marketplace.  So, these assets aren’t just fixed to Twinmotion but they can actually then be used in the Unreal Engine as well.  So, we are really excited about that.

So, the Unreal Engine is very different, and how we describe it is very different. We see it as an advanced 3D real-time creation platform. It's the place that you take your visualisations and advance them to the next level. There are different use cases that we see people using it for, but really it's about that engagement: virtual reality, augmented reality, creating these outputs, these UI configurators that give you an extra element of control. So, it's not just for visualisations; it's for going beyond that. We see it being used across the industry in a variety of different ways, and I'm excited to share a number of those, which you're seeing on the screen in front of you today, and tell you a bit more about how they're using the Engine.

But in terms of both these pieces of software, the things I really want to cover, which may pertain to you a lot because I saw a lot of the questions focusing on this, are why people use the engine and what the engine has to offer, both Twinmotion and Unreal. I usually boil it down to these four things, but today I'm only going to talk about one of them in more detail. We have this great ecosystem. We acquired Quixel, the amazing library of high-quality materials and assets, which now syncs seamlessly into Unreal and Twinmotion.

Data aggregation is working with your proprietary software tools like Rhino and Grasshopper, and we have plug-ins that are free, fresh out of the box and optimised for these platforms, ready to use from the second you open up the platform. The same goes for collaboration. From day one of opening up Unreal, there's a template which allows you to create an experience that you can share with other people, and have multiple people exploring the same space at the same time, even across the world.

Then the last one is the assets themselves: once you have these visual assets in 3D, we want them to be as open and flexible as possible, so you can do what you want with them. You could have one as a render today and an animation tomorrow, but if in a week's time you want to turn it into a virtual reality experience, or a web-based application, or a desktop game, then those options are all available. So, you can customise the experience around your use.

But the big thing I want to cover, because I just heard a lot of it in the Q and A, is this idea of data aggregation and how it works with these external tools, with a special focus today on the Rhino and Grasshopper side. As I said earlier on, we have this built-in tool in Unreal that we call Datasmith, and Datasmith's job is to convert these external assets into Unreal assets in a very non-destructive way. It turns Rhino assets into Unreal assets. It turns Revit assets into Unreal assets. It's meant to be a very quick process that you can also optimise with automation, using a feature inside Datasmith called Visual Dataprep, where you essentially pre-customise a script which runs every time you import your model and does all the prep work ahead of time. If you know you have a nicer version of a material in Unreal, you can get it to replace the imported one automatically. With Heatherwick, they obviously had trees that they wanted to replace; you can get it to take all those trees from the Rhino model and replace them with the lovely trees in our Unreal asset library. So, that's what Datasmith is there to do, and it's constantly advancing to be more and more real time.
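
As a rough illustration of the kind of automation described here (a minimal sketch, not Epic's actual Dataprep recipe), the snippet below uses the Unreal Editor's Python scripting to import a .udatasmith file exported from Rhino and swap placeholder trees for a library asset. The file path, the asset path and the "Tree" labelling convention are all assumptions for illustration.

import unreal

# Import the .udatasmith file exported from Rhino (path is hypothetical)
scene = unreal.DatasmithSceneElement.construct_datasmith_scene_from_file(
    r"C:\Projects\Pavilion\pavilion.udatasmith")
scene.import_scene("/Game/Pavilion")

# A nicer tree from the asset library (hypothetical path)
tree_asset = unreal.EditorAssetLibrary.load_asset("/Game/Vegetation/OakTree")

for actor in unreal.EditorLevelLibrary.get_all_level_actors():
    if actor.get_actor_label().startswith("Tree"):
        # Remember where the Rhino placeholder stood, then replace it
        location = actor.get_actor_location()
        rotation = actor.get_actor_rotation()
        unreal.EditorLevelLibrary.destroy_actor(actor)
        unreal.EditorLevelLibrary.spawn_actor_from_object(tree_asset, location, rotation)

In practice, Visual Dataprep lets you express the same substitution as a reusable graph that runs on every re-import, which is what makes the workflow non-destructive.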

And it's not just us. We are huge supporters of making sure that our platforms speak broadly across the AEC. I think there is a common understanding, and everyone on the call will be aligned to it: we have a big interoperability problem, or opportunity, within the AEC, in the ways that we work and the variety of different tools we use. We need them to speak to each other better, and some of the projects and tools we're seeing emerge from people within the AEC to answer that call are really exciting. Tools like Speckle and the BHoM by Buro Happold, the work being done on Rhino.Inside (from McNeel) and also Mindesk.

I just want to quickly touch on these for those of you who are unaware. First of all, big news, I think from last week: we now have an official Rhino exporter for Datasmith. Before, it was always just a built-in route via FBX, but now it's been optimised for the tool, so we're really excited.

But Speckle and the BHoM are the two big ones where we see a lot of potential and development. If you don't know what Speckle is, it's this amazing open source data platform looking to answer the problem of these multi-programme workflows. It's a cloud-based system that shares data and geometry across platforms, so you can have Revit assets that you see in Rhino in real time, and whenever you make a change in Rhino, you can see it in Revit. It's this interconnected system that both Speckle and the BHoM are looking at addressing, and integration with Unreal has been explored with HOK and Mobius Node. We are supporters of people exploring this space of interoperability. We gave them a MegaGrant, and they're working on developing this tool to create a more integrated system through Speckle, making Unreal part of that equation.

Similarly, with the great work that is going on with Rhino.Inside: Rhino is, in general, a C# tool, which pairs more naturally with Unity. But for those of you who are using Unreal, there is an external wrapper that has been developed called USharp, which provides a C# interface into Unreal, allowing you to use Rhino.Inside within the Unreal Engine.
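
For a feel of the Rhino.Inside idea in a simpler host, here is a minimal sketch using McNeel's rhinoinside package for CPython (pip install rhinoinside; it assumes Rhino 7 is installed on Windows). It shows the "Rhino running inside another application's process" concept the talk describes; it is not the USharp/Unreal wrapper itself.

import rhinoinside
rhinoinside.load()   # boots a headless Rhino inside the current process
import Rhino         # RhinoCommon is now available to the host application

# Use RhinoCommon geometry directly from the host process
sphere = Rhino.Geometry.Sphere(Rhino.Geometry.Point3d(0, 0, 0), 5.0)
brep = sphere.ToBrep()
props = Rhino.Geometry.AreaMassProperties.Compute(brep)
print("Sphere surface area:", props.Area)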

So, there are different tiers of expertise you need for each of these, but it's really interesting to see how people are rising up and addressing this interoperability pipeline workflow.

The last one, which again got a lot of focus in the previous presentation, is Mindesk. It has to be the simplest of them all, creating a real-time, synced, bi-directional workflow between Rhino, Grasshopper and Unreal, with the added bonus of a great interface that allows you to model and work in a virtual environment and see the changes automatically in Rhino and Grasshopper.

So, in general, we're really excited about the way the industry is approaching this problem of data aggregation, not just for the Unreal Engine, but for the entire industry, and we're happy to be a part of that.

And the last thing, just before I jump on to showing some cool stuff, is this idea that we have this great tool called Twinmotion and this great tool called the Unreal Engine, and what we're really excited about is a process, we just released the beta of it, where you're going to be able to export models out of Twinmotion and import them into the Unreal Engine. That basically means you can create a very quick and beautiful architectural visualisation in Twinmotion, very quickly and simply, throw in some lights and some assets, and then export that entire thing into the Unreal Engine to add the next layer, beyond visualisations into something else. This is for the digital specialists, the visualisation experts or UI creators, so that they can add that layer on top. So, we see a streamlined workflow that can be used across the process, from architects to engineers, all the way through to the technology specialists. We're really excited about the development of that.

So, I want to spend a little bit of time talking about the many different ways we're seeing people use the Unreal Engine. We see a lot of people using it for architectural visualisations and VR, but we see it being used across a broad range of areas throughout the building information life cycle, from concept design all the way through to the building and operations side, and that's what I want to share with you today: people using it across a variety of different areas, from digital twins to training to visual communication and virtual collaboration, and some of the ways they're doing that. Again, it's because of these great data aggregation tools that are out there that Unreal can really come out and shine, and that's always part of these processes. So, I'll share a number of them with you today.

What's really funny about the way the tool has developed is that we find ourselves speaking less and less about the visual fidelity of Twinmotion and Unreal, mostly because we feel that it's a given. The visual quality that you can get out of these tools (this is Twinmotion) just speaks for itself. So, we find ourselves spending less and less time talking about the visual quality and fidelity you can get out of Twinmotion or, alternatively, out of the Unreal Engine, and focusing more on the UI and the use cases that push it beyond that. That said, for anyone who is really interested in the arch-viz side, I just wanted to throw these two examples out there to interest you.

We recently worked with The Mill on creating videos released at Unreal Fest about the different industries using Unreal Engine, and what's special about this collaboration with The Mill are the effects that you're seeing: the construction of the buildings, the ripple effect, the Inception/Dr Strange-esque style of creation. We actually have a webinar which shows you how we created it. Similarly, this example here is using real-time ray tracing. We have that asset for free on the Unreal Marketplace. You can go in and see exactly how it was set up, the assets that were used, the light settings, to help you understand how these visual qualities were developed and achieved. Like I say, we rarely find ourselves focusing on these any more.

We find ourselves more focused on these amazingly fun areas of immersive design. People are using it in the early stages, or in the more immersive sense, like AF Consult, who are using VR as a way of designing with the client. So, this is a GIS map that has been brought in, and they're using it as a way to draw potential transportation, road and rail links through that 3D geometry, but in VR, in a multi-user session, so other people are able to be in there at the same time to share and understand it. All the way through to the work I was part of, for CallisonRTKL, where, since we have people in virtual spaces all the time, they built a tracking tool so they can see where people go in these virtual spaces, what they look at and what objects they're hitting.

So, they can better define what areas are important, where people are not going, what people find interesting, and then plan and design around that. Is there an area that people aren't interested in? Is there something we can do in that area to grab attention and focus? That's really a great use case of what is coming out of the engine. And although we saw Speckle and the BHoM, there are a bunch of other data integration pipelines that focus more on parametric building design. This is a project from Cornell University that works with City Engine, bringing in the parametric control that City Engine has and exposing those functions within the Unreal Engine.

So, lots of really cool use cases on the immersive design side, but I would say the biggest and most beneficial use we see, which again Heatherwick touched on, is this ability to communicate ideas, information and vision, be it in an immersive space or by creating custom asset tools. This is a building asset management tool created by Cityscape for leasing managers to use when speaking to future tenants, so they can explore the building footprint and see it at one-to-one virtual scale. But it also links that Unreal model to a financial model, so they're able to draw out new floor plates, and that will update in real time what the cost of that space might be. This kind of integration with real-world data, giving context to that information, is really where we're seeing a powerful use of the tool, exploring that new space and the engine. All the way through to the work that Arup are doing. They have this amazing driving simulation: because it was all created in Unreal as a 3D asset, they created a customisable driving simulator tool, a fly-through and a real-time walkthrough, which they actually hooked up to an entire driving simulation game.

So, it's just great to see the different ways that people are communicating and engaging with people around these ideas, all the way through to the work they've been doing at AccuCities. Their newest tool is Planned City. They have a digital twin of London and a variety of different cities throughout the UK, and the way their model links in with city data to help future city planners see their future buildings and city data in context is really exciting. The Planned City tool looks at integrating your actual models, your geometry and your information, whether it be a simple massing model that you can run a bunch of simulation tools on within the application, like the visibility sightlines it's doing here, showing where your buildings can be seen from, or, if you want to see it more clearly, importing your own building design and running the same simulations on that.
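
As a purely illustrative aside (not AccuCities' implementation), a visibility sightline test of the kind mentioned here can be reduced to a simple geometric question: does the straight line between a viewpoint and a target clear the surrounding building volumes? The sketch below samples the segment against axis-aligned boxes; all coordinates are made up.

def segment_hits_box(p0, p1, box, steps=200):
    """Sample the 3D segment p0->p1 and test each sample against an
    axis-aligned box given as (xmin, ymin, zmin, xmax, ymax, zmax)."""
    xmin, ymin, zmin, xmax, ymax, zmax = box
    for i in range(steps + 1):
        t = i / steps
        x = p0[0] + t * (p1[0] - p0[0])
        y = p0[1] + t * (p1[1] - p0[1])
        z = p0[2] + t * (p1[2] - p0[2])
        if xmin <= x <= xmax and ymin <= y <= ymax and zmin <= z <= zmax:
            return True
    return False

viewpoint = (0.0, 0.0, 1.6)        # an observer at eye level
target = (120.0, 40.0, 30.0)       # e.g. the top of a proposed massing model
buildings = [(30, -10, 0, 50, 20, 25), (70, 25, 0, 90, 45, 40)]

blocked = any(segment_hits_box(viewpoint, target, b) for b in buildings)
print("Sightline blocked" if blocked else "Sightline clear")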

So, the communications side is a huge area, but I think the most important and most relevant to today's climate is this idea of virtual collaboration, and this actually addresses one of the questions brought up by Paul at the end of the last session, which we will get to in just a second: this idea that, now we're not able to be together, we can still virtually meet in spaces and communicate and explore ideas. The Unreal Engine has the collaboration template, which allows multiple users to be in a space like this. Thea, the company that you see in front of you, built on top of it. They said, this is great, we'll add more functionality to it, allowing deeper control and integration of your ideas, so you can share them in a virtual way, in VR, sketch, annotate and communicate, both in VR and on desktop, which I think is really exciting. There are betas currently online, it's called Big Room if you're interested, and I can provide links to all of that afterwards.

But I guess the big thing that is really important and really cool in this day and age is that there is no longer the need to have big, powerful gaming computers in order to view the content. The example I want to share is from a company called PureWeb, who use a new feature of the Engine called Pixel Streaming, or their own version of it, where you can load your executable Unreal instance onto a server and then share it as a web-based application. So, this is running in Google Chrome, where all the functionality of the experience, the sales configurator, the quality and the fidelity, is running in a browser, through a web link that you can just share. You can do this fresh out of the box using Pixel Streaming; there are lovely tutorials online about it. You can then look to host it on your own servers for much larger audiences, using AWS instances, but it gets a little bit more technical, which is why things like PureWeb exist. They are there to work with you: you give them the visual content you want, and they'll handle the back end, the servers, the hosting and the graphics, so that you can go around and share this content. So, lots of different avenues to explore.
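
For orientation, launching a packaged Unreal application for Pixel Streaming in the UE 4.2x era looks roughly like the sketch below. The build path is hypothetical; -PixelStreamingIP and -PixelStreamingPort are the documented UE4 flags that point the application at the signalling server (the Cirrus server shipped with the engine, which must be running separately), and -RenderOffScreen lets the instance run headless on a host machine.

import subprocess

# Hypothetical packaged build; the flags connect it to a local signalling server
subprocess.Popen([
    r"C:\Builds\Pavilion\Pavilion.exe",
    "-PixelStreamingIP=127.0.0.1",
    "-PixelStreamingPort=8888",
    "-RenderOffScreen",
])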

The next one is actually fresh off the press; it happened about a week ago: people hosting virtual events and talks. This is again a screen grab, to show it was running in a web browser, of the latest… a world digital built environment 2020 event. It was created in Unreal, and it hosted users exploring this virtual space created in the Engine while witnessing talks, with virtual production techniques like green screen included in the platform, so you can view it in the browser, navigate, walk around and experience presentations on a much more one-to-one basis. We are seeing a lot more of this emerge, which is really exciting.

The last point is just that other people are looking at these virtual collaboration tools too. We have a free one; ESRI have created a free one for the Engine as well; Thea have worked on a much bigger one; and I think Space Form by Squint/Opera is again a much higher-level professional one. So there are lots of different options for exploring and using the space, whether you want to create it yourself or use one of these great presets in front of us.

So, that's virtual collaboration. A lot of our conversations seem to be focused on that at the moment, for obvious reasons. And although this next one may not pertain much to the work that you guys do, I always keep it in because I love the use cases and the different ways people are using the Engine. This is commonly used in real estate and building operator use cases, like Aedas Homes who, like the tools we just saw, mixed virtual production with the Unreal Engine so they're able to bring in the 3D assets of a building. But the woman you're seeing on the screen isn't a recording; it's a live feed to their studio, and you can ask her questions. Does this lovely couch come in a mahogany green? Yes it does. Then you can change it. So, it's bridging this gap: when we can't be together, what's the best way to engage with people and talk about big ideas? All the way through to the work that Line Creative have recently done. We've all seen AR before, but this has to be one of the highest quality examples I've ever seen, just in the level of quality they've managed to retain from the Engine in these AR applications. And the tools around this in the Engine are constantly getting better, which basically means you can have these AR models out in the middle of a city and not need a marker like a sheet of paper in order to load the model up, which is super exciting.

Then you have Imerza. This is an awesome one, mainly because it's a 3D printed model, onto which they use 12 laser projectors to project imagery from the Unreal Engine. So, they're able to communicate with large audiences and large groups of people around a fixed masterplan. It's probably the closest thing to a living masterplan model I've ever seen, and it links in live with the Engine. It's real time, so that view you're seeing is from the Engine. You can change it and move it around, and it updates on the model itself as well, which is what we really like to see.

For those of you looking ahead to future trends, we're having more and more conversations around this idea of digital twins and smart cities, which is really exciting. We're so glad to be a part of it, because we see a lot of digital twin solutions going around where people are bringing in geometry and trying to represent sensor information so it's understandable, and this is where I feel game engine technology comes into its own. The ability to put sensor information into context, in a controllable UI format, is really how you get understanding of that data, and how it becomes information.

This is a company called 51World, who have built an entire business around working with clients and developers on bringing their physical assets into a digital UI. So, they can control them, visualise live sensors in the Engine, see how they're performing and interrogate it, from a very small scale, like a floor-by-floor view of a building, all the way up to much larger sizes and scales. They've done it right up to these bigger buildings, all the way through to, I think we released a video online a couple of weeks ago now, an entire city. It's massive. So, yes, they're building digital twins for cities, and it's really exciting to see what they're going to do with it next.

I realise we're running out of time, so I'll jump to our last one, which is scale and size: the ability to build open worlds. Talking about data aggregation from Rhino and Grasshopper, there's the ability to work with those tools alongside things like LIDAR scanning technologies, and we can now import LIDAR scans. This one is by a company called Virtual Wonders, and the model you see in front of you, first of all, is real time; you can walk around it in a VR headset. But it's points that you're seeing. That's not meshes, or points converted to meshes. The point density is so tight that it actually appears as a solid object. So, as scanning technology improves, what the real-time render engines can handle is also increasing, and a lot of people are pushing the boundaries on this. All the way up to what Build Media have done, which is build the entire city of Wellington, New Zealand in Unreal. And it's not just Wellington; they've put in the entire country, using a very streamlined workflow of GIS data, photo scans and BIM information. This isn't a video animation; this isn't special effects and clever cutting. We've actually kept in the views from the Engine, which show it running at a 50 frames per second rate as you go through the model. And what's great about one platform, many assets: they're now looking at working with Wellington City Council on integrating IoT information to turn it into a digital twin. So, the future of where your models can go is amazing.

That's most of what I want to talk about, and I realise I'm coming up to my half hour slot. The last thing is really the importance of this slide, which we spoke about at the beginning: if you're not on the Unreal side of it and you're working in design tools like Rhino and Grasshopper, Twinmotion is a great start for you. The Unreal Engine is a great next step, but there's a slightly steeper learning curve, and that's where you rely on your digital specialist team if you have one, or on being very technically savvy. It's great to jump in, because the potential of what you can do is limitless. But the whole idea is that you can work with one and translate into the other. That's really the future of where real-time rendering technology needs to go.

And the good news, the great news, is that the Unreal Engine is completely free. It's open for anyone to use. I love this fact. Unless you're creating a game that you're going to release on the PS4 or Xbox, generally we don't want to hear from you as far as licensing goes. If you're releasing a PS4 title of the newest building, then maybe we should have a conversation, but in general, use it, develop it for what you need, and we don't ask for anything in return. It's there to use. The source code is available, and you can add to it and build on it through the Marketplace, and that's what I love about it.

On the Twinmotion side, it's not free; it's a paid product. It's free to try, but the good news, which was brought to my attention today actually, is that Twinmotion is free to Rhino users. We have a new tie-in which we're doing for a limited time, so any Rhino 5 or 6 owners can get a perpetual license. It's not a subscription; we don't do the subscription thing with this. With Twinmotion, you have the software and it's yours forever, essentially, absolutely free. I'll make sure the link is in the chat at the end of this. All the information is there on how to get it, and with the version you get, you'll receive all the updates up until the end of 2021, which again is great news for all. So, that's everything I want to talk about.

The last thing, which I spoke about very quickly earlier on, is this idea of MegaGrants. We're very big on supporting innovation in the industry as a whole. You saw earlier on that we're supporting HOK with Speckle, and a number of the people you may have seen in the presentations are also recipients. It's a pot of $100 million in grants that we've allocated to developers to create amazing things using these tools. So, if you have a great idea, if you've got something that you really want to experiment with and try, we encourage you to create a proposal and send it in to our Epic MegaGrants team. You could be lucky and receive a grant that supports those developments around integration and interoperability, like Speckle. We're really excited about exploring that. We've seen some great stuff come out of it, and we encourage everyone to have a think about how you can push the boundaries on solving these large-scale problems.

Lastly, I can't actually answer anything on this, because it's currently heavily under development, but I'm sure everyone saw Unreal Engine 5, which is coming out in 2021, and this is the future of where real-time engines are going. If you haven't seen any of the footage, go and have a look. This is what we're working towards, and Unreal Engine 4 is a great place to start, because the integration and switch-over will be seamless: everything will work and translate across, but with all the great benefits.

That's me. I think I'm bang on my half hour. I've done this a lot, so I think I've timed it pretty well. I'll stop my screen share and hand back to Paul to open up for some questions.

PAUL:  Great, thank you David. It would be excellent if Silvia and Pablo could join us again as well, because there are some other questions that have come back to them. Brilliant, thank you Silvia, thanks David. Excellent, I loved the announcement about free licenses of Twinmotion for users of Rhino 5 and Rhino 6.

DAVID: It’s awesome.

PAUL:  Let me put the link in the chat so everyone can have a look at it and explore it, and yes, we can jump to questions.

I'm going to ask this question first because it was marked as urgent, but I think it's a question for you, Silvia. How do you quickly replace the bubble trees from Rhino with the Unreal Engine tree asset? Would it replace all objects at once, or is it a more manual process?

SILVIA:  So, as Pablo mentioned, we can use Mindesk for that, and we can replace them by giving the tree the name of the corresponding asset in Unreal and mapping it in Mindesk; it will then literally change as we speak, the moment you connect everything in Mindesk. But if you are not using Mindesk, then you can just place little dots and paint the trees in at the positions of the dots.

PAUL:  Thank you. I wanted to mention, as I said right at the start, that this event is being recorded, so we will be posting a link to everybody where you can watch it again. A question on coding experience: how much, and what type of, coding experience is useful for people who are interested in Unreal, and is any such experience required for Twinmotion?

PABLO:  I would say none. There is no coding required to jump into either of these programs. I think that's the beauty of them. That doesn't mean that you can't go further if you want to, but…

DAVID:  The thing I like about it is that you guys who use Grasshopper should be familiar with it. It's a very visual, scripting-based UI. You can of course go into the coding; this is where your expertise comes in, and you can do a lot more with C++, or by using the USharp wrapper, which gives you a C# route in. You can get into the coding, but by no means do you have to. In fact, I usually say that if you have to, then we need to be working harder at Epic Games to provide that solution for you without coding. So, that's our stance. It's Blueprints and visual scripting, which I think people here will be familiar with.

PAUL:  Many will be. Okay, was there a mention of a timeframe for a Twinmotion to UE bridge?

DAVID:  So, we're saying early 2021 at the moment, I think. We're working on the beta with a number of people, so we're exploring it, but there's no official announcement date yet; it's just in the works. Pablo and Silvia, I can't remember if you were part of that, but it's in beta at the moment. No official announcement date. Apologies.

PAUL:  Okay, there's interest in MegaGrants. Are there any requirements? Do you look for particular skill sets or particular experience? Leonardo is very interested in your MegaGrants.

DAVID:  Generally, there are some requirements that we look at. We care about a number of things. First of all, we care about innovation; that's one of the big things. We want the idea to be innovative, and because we're very open, we also prefer proposals that are altruistic. We love them to benefit an industry or a wider group of people, by being made free or open source. The only skills we look for are that you either have the technical skills to explore the idea or are willing to work with parties that do. We've seen MegaGrants that include and work with companies and developers who use Unreal Engine, and we've seen them from Unreal developers alone, so we've seen it across the board in a variety of different ways. I would say the weight really comes down to the innovative and altruistic side of the proposal more than the expertise. But it's a good question.

PAUL:  Thank you.

DAVID:  I look forward to seeing his proposal.

PAUL:  Yes, I think one might be on its way. Is the TM / UE bridge available to the public?

DAVID:  Not in public beta yet, no. We're running it with a number of people who are obviously heavy Twinmotion and Unreal users. At some point that will change, but it's early days at the moment. Trust me, it will be announced and shared on all our channels when we have something. At Epic Games, for those of you who are more familiar with the Engine, we're really open about including people in our previews and early versions of things. You can download the newest version, the Unreal 4.26 preview, which is not the official release but a beta for people to start exploring. So, we are really open about sharing these things, and once it becomes available, you guys will be among the first to know about it.

PAUL:  Excellent. Is it possible to say a little bit about hardware? Pablo, perhaps you could share your experience of what hardware is required for good performance with Unreal? I guess that's workstations rather than headsets?

PABLO:  Silvia?

SILVIA:  So, basically, we are using Alienware and Dell Precision machines, but we can also run it on our studio computers. It will be a bit slower, but you can open it anywhere, basically.

PABLO:  Maybe it's beneficial for the audience, Silvia, if you can mention the size of a file, so the…

DAVID:  I guess that's the important thing. I mean, first, it depends on what you do with it. Instances like Build Media bringing that level of fidelity to an entire city will obviously require a little bit more than smaller projects. Generally, what Silvia said is a good all-round answer: you need a reasonably good graphics card, but Alienware and MSI laptops can run it quite happily and functionally, all the way up to professional-grade workstations. The one caveat I would add is that if you're working on masterplanning projects and want to bring in 20,000 assets, then higher-grade computers will obviously be key to saving you time on optimisation. But there are also a number of great optimisation tools within the engine that make working with large files and assets easier: LODs, culling and so on will allow you to explore and work on the models without needing the highest-level graphics cards. But it varies; it's use case by use case.

PAUL:  Okay, thank you. Is there still a place for tools like 3D Studio Max? It seems your visualisation workflow has changed a lot, so how do tools like that still fit into the workflows we've seen tonight?

DAVID:  I can jump in on this one. In general, at Unreal Engine, we rely on everyone's tool sets. If a company is using 3D Studio Max, then great; we want to be able to support it. The thing with these packages is that they're probably more tailored towards other industries or other specialities. As the architect grows and the profession matures, the computational side is becoming more dominant, and a much smarter and faster way of working, so we're seeing more people using those tools now than 3D Studio Max. However, speak to our virtual production guys in the movie industry and they'll say the complete opposite; that's where 3ds Max and Maya are maybe a little more dominant. So, we don't see it ever going away, and we will continue to build adaptations for it, but in the AEC, and Silvia and Pablo will agree, there's this computational designer role. To put it in context: in gaming, we talk about next-gen consoles all the time; the next generation is coming. We very much see computational design tools like Rhino and Grasshopper as the next gen for architects. So, these things come to fruition more and more.

PABLO:  I agree.

PAUL:  Thank you. Is it necessary to have RTX cards? I guess that's specific to NVIDIA boards… go on, David.

DAVID:  It's not, and it is. If you want to start to push the boundaries of real-time ray tracing, then yes, you are definitely going to need RTX hardware. It always comes down to what you're looking to do. What is the output? The higher the graphical and visual quality, and the larger the model and assets, the more likely you are to need it. There are comparison charts on the Unreal website where you can see what works and which features need particular hardware, but I would say there is a use for it, and don't rule it out if you want to do real-time ray tracing, which is awesome.

PAUL:  Is there a possibility to live-link Grasshopper and Unreal Engine without Mindesk? Pablo, is this possible? I guess there's not going to be as nice and slick a solution as there would be using Mindesk?

PABLO:  David mentioned a couple. So, the BHoM is one way of having this direct link. We actually used it; we had a mini project going on with Buro Happold, trying to test and develop it with them, so that's one way of doing it. You mentioned Speckle as well. We haven't tested the connection with Speckle, but it seems like it does it as well. And there's also Rhino.Inside.

DAVID:  What I was trying to get at with these different solutions is that they require different levels of expertise and knowledge. Speckle and the BHoM are great, but there's a lot of understanding you need to build up around those tools and how you use them, whereas what you see done with Mindesk is a very simple, quick, efficient, out-of-the-box way of doing it. And Rhino.Inside requires knowledge of Visual Studio and USharp. So, there are lots of different solutions for getting this real-time bi-directional link between Grasshopper and Unreal, but which one is best really depends on your knowledge and your area. I think they're all great; it all just depends on the use case you've got and the experience and time you have to invest in them.

PAUL:  Questions are flying in at the moment. How is the lighting typically handled in the scenes today? Is it baked, or real time?

DAVID:  How do you guys do it at Heatherwick?

SILVIA:  If we are showing it in real time, we don't bake it, but if we are doing a proper animation, yes.

DAVID:  We've seen a lot of new tools coming out in this area. I think there's the GPU Lightmass tool being released with 4.26, which allows a more real-time approach to baked lighting quality. But it always comes down to optimisation of the Unreal asset. If you are trying to make your Unreal files more streamlined and get those high frame rates, dynamic lighting is very tough to do, because it requires more graphical power. Baking your lighting in, although it's fixed, gives a higher level of fidelity. If anyone has seen the Unreal Engine 5 trailers, they'll know that's what Unreal Engine 5 looks to blow out of the water, by allowing that high-quality global illumination to be controlled in real time rather than baked in. So, that trade-off will soon be a thing of the past, but there's room for both.

PAUL:  Question for Pablo. How is Heatherwick accounting for occlusion culling in AR scenes? And the question goes on: ARCore uses the Depth API and ARKit is using LIDAR; are you implementing these new APIs in your process?

PABLO:  Rather than going through the details of each part of that question, I think it's better if I describe how we are using AR at the moment. We are using Fologram to connect Rhino with the HoloLens and mobile devices, and for that we are controlling absolutely everything through Grasshopper. But we are also starting to use ARKit. Since the beginning of this year, we have been testing different platforms, and ARKit is one of the ones we like the most, which is something the person asking the question also mentioned. And the bespoke tool that I presented, which is a web-based tool, is the product of a one-day hackathon, so it's really at an early stage; it's in sketch mode, and basically we will be addressing all these questions as we go ahead with the project.

PAUL:  And a linked question for David: is Unreal working on Blueprints to account for these new occlusion culling technologies?

DAVID:  Probably not Blueprint things, but I think this is something that will come more into the Unreal Engine 5 capabilities. Not that I'm aware of a direct focus on it, but when there is, it will be more a feature of the Engine than a particular Blueprint option. I would hang fire on that one.

PAUL:  Okay. Those questions were from Chance. Now, there was something else from Gabrielle for you, David. Is there an expected timeframe for when the Unreal 5 beta will be available for partners and developers?

DAVID:  Again, unfortunately not. It's always a fun conversation to have; everyone wants to get their hands on it first. Even within 2021, there's no exact timeframe we've set for when the beta comes out or when the early release arrives. The only thing I can tell you is that when we know, you'll know, because we make all these things very public. The reason it's not public at the moment is that there's nothing publicly out there yet on this. So, hang fire and keep an eye on the news is what I'd say.

PAUL:  A question from Reece, maybe for Silvia: can you mix and match assets coming in from Rhino with assets from Maya and 3D Studio Max?

SILVIA:  Yes, it's like… if you are importing geometry from different software packages, that's the question, right?

PAUL:  Yes, what assets, so I guess geometry?

SILVIA:  Yes, obviously. It depends how you're importing them, but you can import from every software package and have them all in Unreal. That's the beauty of it.

PAUL:  Another workflow question. Is the optimisation of the mesh distance fields that Unreal Engine uses handled in Rhino or Datasmith, or is there manual control for that?

DAVID:  I'm not quite sure what that part refers to.

PAUL:  Reece, if you want to word the question in a different way, I'm happy to ask again.

PABLO:  Is it asking where the optimisations happen?

PAUL:  I'll go on to something else and come back. We have a poll for everybody, and I wanted to mention a couple of things. Thank you to all of the presenters: thank you Pablo, thank you David, thank you Silvia, fantastic presentations. Thanks for joining us. We will be back, and we have something scheduled: although we don't have a date yet, Grimshaw have agreed to present at the next version of this meeting, which will hopefully happen in January, maybe sooner. You'll be the first to know. Please either sign up for our newsletter or follow us on social media at Simply Rhino, and you'll hear about all of this. As you know, we do Rhino training and we supply Rhino software; just to mention a couple of classes coming up: Rhino Level 1, Grasshopper classes and more advanced Rhino classes, all of which are being delivered live and online at the moment.

So, shall we go to Silvia, and Silvia's little explanation of how Mozilla Hubs is going to work for us?

SILVIA:  How do we send the links?

STEPH:  I put the links in the handouts. So, there is a PDF in the handouts that has the links to the rooms in it. Hopefully people will be able to see that.

SILVIA:  Right, so if you guys want to have a little talk with us, and between yourselves as well, just click on the links provided. We have four different rooms. The first one is The Vessel, and the others are magic places that you can click on to enter each room. Just press 'Enter Room' and choose an avatar; I'm going to be Santa Claus if you want to find me. You just enter the screen, allow your microphone, and then you can navigate inside each room with the same controls as in Unreal: A for left, D for right, W for forwards and S for backwards. You can pan with your left mouse button and you can jump to places with your right mouse button. If you are close to an avatar, you will hear their voice more clearly, and if you are far away, you will not be able to hear anything.

PABLO:  Can you also copy and paste the links into the chat box, because I don't think we have the handouts, actually.

STEPH:  I can do that.

PABLO:  Are we all going to be in the same room or are we splitting?

PAUL:  I'm going to go to the Spanish Port, because that sounds like Barcelona and I think I know who will be there.

SILVIA:  I am there.  So, I can see a lot of faces now.  This is quite fun.

PAUL:  So, we're going to close down this window, say goodbye from here, and we'll see you at the Spanish Port, or The Vessel, or these other places very soon.

The post AR/VR for Rhino and Grasshopper UK UGM | October 2020 appeared first on Rhino 3D.

]]>
Grasshopper UK UGM | November 2019 https://rhino3d.co.uk/events/grasshopper-uk-ugm-november-2019/ Wed, 23 Oct 2019 13:24:21 +0000 https://www.rhino3d.co.uk/?p=1580 Grasshopper UK UGM – Special Updates from McNeel (RhinoInside, Sub-D, Rhino 7 and more..) Join us and McNeel at AKT II for this special Grasshopper User Group meeting with updates […]

The post Grasshopper UK UGM | November 2019 appeared first on Rhino 3D.

]]>
Grasshopper UK UGM – Special Updates from McNeel (RhinoInside, Sub-D, Rhino 7 and more..)

Join us and McNeel at AKT II for this special Grasshopper User Group meeting with updates from the team at McNeel, including details on RhinoInside, Sub-D, Rhino 7 and more.

AKT II will kick off the evening’s proceedings before we move on to McNeel’s presentation and an extended Q&A with you and the McNeel team.


 

  • Wednesday 20th November 2019
  • 18:30 – 20:30
  • AKT II, White Collar Factory, 1 Old Street Yard, EC1Y 8AF
  • You can book your place at the meeting HERE via Eventbrite
  • We kindly ask that all interested in coming to the meeting book only 1 ticket per person – space is limited and we want to have as many of you along as possible – thanks!

 


 

Presenting at the meeting:

AKT II's p.art team will present a selection of recently completed projects and applied research in the fields of advanced manufacturing, Mixed Realities and IoT performance sensing.


 

McNeel on RhinoInside, Sub-D, Rhino 7 and more…

  • Rhino/Grasshopper Inside Revit, Unreal Engine and more

Rhino.Inside® is an open source Rhino WIP project which allows Rhino and Grasshopper to run inside other 64-bit Windows applications such as Revit, Unreal Engine, etc.

  • QuadRemesher

  • Sub-D Modeling

Subjects that can also be discussed:

  • Cycles – Realtime Raytrace Renderer
  • Rhino Compute Service

 


Thanks to AKT II for hosting this meeting. You can see photos from our last Grasshopper meeting at Heatherwick Studio in July here.

The post Grasshopper UK UGM | November 2019 appeared first on Rhino 3D.

]]>
Grasshopper UK UGM | July 2019 https://rhino3d.co.uk/events/grasshopper-uk-ugm-july-2019/ Thu, 04 Jul 2019 11:07:02 +0000 https://www.rhino3d.co.uk/?p=1397 Grasshopper UK UGM – Special Update from McNeel (RhinoInside, Sub-D, Rhino 7 and more..) Join us and McNeel at Heatherwick Studio for this special Grasshopper User Group meeting with updates […]

The post Grasshopper UK UGM | July 2019 appeared first on Rhino 3D.

]]>
Grasshopper UK UGM – Special Update from McNeel (RhinoInside, Sub-D, Rhino 7 and more..)

Join us and McNeel at Heatherwick Studio for this special Grasshopper User Group meeting with updates from the team at McNeel, including details on RhinoInside, Sub-D, Rhino 7 and more.

Heatherwick Studio’s Ge-CoDe team will kick off the evening’s proceedings before we move on to McNeel’s presentation and an extended Q&A with you and the McNeel team.


 

  • Tuesday 23rd July 2019
  • 18:30 – 20:30
  • Heatherwick Studio, 356-364 Gray’s Inn Road, London, WC1X 8BH

 


 

Presenting at the meeting:

 

Heatherwick Studio Logo

Heatherwick Studio's Ge-CoDe team will be presenting their ongoing research on mixed reality and emergent technologies applied to recent projects.

 


 

McNeel – the presentation from McNeel will include:

  • Rhino & Grasshopper in AEC and a few User Projects

 

  • Development platform and Tools

  • New Developments & Frameworks (Rhino WIP)
    • QuadRemesher
    • Sub-D Modeling
    • Cycles – Realtime Raytrace Renderer
    • Rhino VR
    • Rhino Compute Service
    • Rhino/Grasshopper Inside Revit: Rhino.Inside® is an open source Rhino WIP project which allows Rhino and Grasshopper to run inside other 64-bit Windows applications such as Revit, AutoCAD, etc.

 


 

 

Thanks to Heatherwick for kindly hosting this Grasshopper UGM.

 

Thanks to everyone who joined us – great night, again! Check out some photos of the evening in this gallery:

 

The post Grasshopper UK UGM | July 2019 appeared first on Rhino 3D.

]]>
Grasshopper and AR/VR for Rhino UGM – February 2019 https://rhino3d.co.uk/events/grasshopper-and-ar-vr-for-rhino-ugm-february-2019/ Thu, 13 Dec 2018 12:33:06 +0000 https://www.rhino3d.co.uk/?p=1042 With Grimshaw, Fologram, Foster + Partners and Chaos Group For our first User Group Meeting of 2019 we are combining our usual Grasshopper UGM with our VR/AR user group. The […]

The post Grasshopper and AR/VR for Rhino UGM – February 2019 appeared first on Rhino 3D.

]]>
With Grimshaw, Fologram, Foster + Partners and Chaos Group

For our first User Group Meeting of 2019 we are combining our usual Grasshopper UGM with our VR/AR user group.

The group is for those who are interested in meeting in order to network, discuss and explore Grasshopper3d and virtual and augmented reality solutions for Rhino3d.

The meetings follow a simple format of at least one presentation from a customer with experience in this field, followed by group discussion and informal pleasantries.

Confirmed Presenters are Grimshaw, Fologram, Foster + Partners and Chaos Group.

Details: This meeting took place on Thursday 21st February 2019 at Grimshaw, 57 Clerkenwell Road, London, EC1M 5NG

Special thanks to Andy Watts and the team at Grimshaw for hosting this latest UGM.

Meeting notes are available at the bottom of this page.



Preceding this UGM there was a 3-day workshop with Fologram on 19th, 20th and 21st February 2019 at Grimshaw – find out all the details about this workshop here on the Simply Rhino site.


Presentation by Grimshaw

Within the Design Technology team at Grimshaw, the VR and computational design are key components of the work our team undertakes to support our design teams and research new ways of working.

Grimshaw VR Demonstration Shot

The use of VR has grown to become a well-established part of the design toolset at Grimshaw, from internal reviews and design checking through to client presentations and stakeholder engagement. More recently, AR has shown the potential to introduce a new facet to this, overlaying our design information on a more readily understandable physical context, be it at full scale or otherwise.

From projects such as Waterloo International through to Expo 2020 Dubai, opening next year, the work of Grimshaw has always had a strong relationship with computational design. Today, tools such as Rhino and Grasshopper are integral to our everyday work.

Grimshaw VR Demo on Tablet

Recently, our in-house Design Technology team has been looking at ways of merging these two key work-streams. Whether through bespoke in-progress workflows or through the use of more developed tools such as Fologram, we are actively seeking ways to enable our teams to harness the power of computational design tools such as Grasshopper in an immersive 3D environment.


Presentation by Fologram

Fologram is a toolkit that allows designers to build interactive mixed reality (MR) applications quickly within Rhino and Grasshopper. By providing users with access to device sensor data (spatial meshes, gesture events and computer vision tools running on camera feeds) as inputs to parametric models, the full ecosystem of Grasshopper plugins (physics simulations, structural and environmental analysis, machine learning and so on) can be extended to run in mixed reality. Gwyllim Jahn and Nick van den Berg will demonstrate applications developed with Fologram by partners and clients that augment existing processes of design, modelling, analysis and making.
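
As a purely illustrative sketch of that pattern (sensor data driving a parametric model), the GhPython snippet below imagines a gesture event position arriving as a component input and rebuilding geometry around it. The inputs tap and radius are assumptions for illustration; Fologram's own components are not shown.

# GhPython component: "tap" is imagined as a Point3d supplied by an MR
# toolkit component (e.g. the position of a gesture event); "radius" is a
# slider-driven float. Both are ordinary Grasshopper inputs to this script.
import rhinoscriptsyntax as rs

if tap is not None:
    # Rebuild a sphere wherever the user last tapped in mixed reality
    a = rs.AddSphere(tap, radius)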

Using Fologram with mobile phone to design in Mixed Realities

Designing and making within mixed reality environments extends the skills and capabilities of designers and builders by improving spatial understanding of design intent and reducing the risk of human error associated with extrapolating 2D instructions to 3D form. These new capabilities dramatically improve the ability of conventional craftsmen and construction teams to fabricate structures with significant variability in parts, form, structure, texture, pattern and so on, and in many cases completely reverse design viability as impossibly expensive and difficult proposals become straightforward, low risk and cheap. Complex designs can now be fabricated on standard building sites, with cheap materials and tools, and without expensive expertise or design documentation.

Using Fologram to design in Mixed Reality

We will discuss work from Fologram that investigates the implications of MR assembly methodologies on architectural design through the lens of several architectural prototypes. Could making in mixed reality allow us to refigure CAD-CAM not as a means of working to high degrees of tolerance and precision, but instead as a return to craftsmanship, intuition and reflexive making? How will the medium of MR enable new forms of collaboration between designers and manufacturers, or between humans and machines? What new architectural forms might be found in this superposition of the digital and the craftsman?

Working with Fologram in Mixed Reality

At the end of the presentation there will be the opportunity to have a brief demonstration of the Fologram toolkit on the HoloLens and mobile phones, and discuss applications within research, teaching and practice.

Check out Fologram’s vimeo channel to see Fologram at work.

 


Presentation by Jonathan Rabagliati from Foster + Partners

The Bloomberg Ramp | Rising through the centre of the building, the distinctive hypotrochoid stepped ramp animates the whole Bloomberg office space. Fabricated as a steel monocoque, the ramp is clad in bronze panels. Their form is based on a mathematical curve called a hypotrochoid, which forms a smooth, continuous three-dimensional loop that rises up to the skylight. Each loop cuts through a near-elliptical opening in the floor plate, and these elements, rotating through 120 degrees on each level, create dramatic views that open deep into the building.
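
For the mathematically curious, the hypotrochoid mentioned above has the standard parametric form x(t) = (R - r)cos t + d cos(((R - r)/r)t), y(t) = (R - r)sin t - d sin(((R - r)/r)t). The sketch below lifts the curve into a rising loop to suggest a ramp; R, r, d and the rise are made-up numbers, not the Bloomberg geometry.

import math

def hypotrochoid_ramp(R=9.0, r=3.0, d=5.0, rise_per_turn=4.0, samples=360):
    """Return 3D points of a hypotrochoid lifted into a rising loop.
    With R/r = 3 the curve has the three-fold symmetry described above."""
    pts = []
    k = (R - r) / r
    for i in range(samples + 1):
        t = 2.0 * math.pi * i / samples
        x = (R - r) * math.cos(t) + d * math.cos(k * t)
        y = (R - r) * math.sin(t) - d * math.sin(k * t)
        z = rise_per_turn * t / (2.0 * math.pi)  # linear rise with the sweep
        pts.append((x, y, z))
    return pts

points = hypotrochoid_ramp()
print(len(points), "points; first:", points[0])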

The ramp is central to the way Bloomberg chooses to operate, embodying a sense of movement and dynamism through its form and function. It is conceived as a place of meeting and connection between people and parts of the office. As the primary connection between the floors, it acts as a great social condenser, bringing both life and light to the building.

The presentation by Jonathan Rabagliati charts the story of the design through to fabrication, using computational design, VR, laser scanning and metrology, and close collaboration with structural engineers and contractors to realise a remarkable design.

 


Presentation by Chaos Group | V-Ray Next: Immersing in Parametric Design

With 15 years at the forefront of expanding the possibilities of visualisation, Chaos Group have reached the point where groundbreaking performance is the expected feature of every release, and have moved on to pushing the limits of what is possible to visualise. There is no better testament to that than V-Ray's latest Next line, and in particular the upcoming V-Ray Next for Rhino.

On February 21, CG Specialist Lyudmil Vanev is coming on stage with something far more powerful than a new version presentation, however exclusive it may be. Lyudmil is showing our whole take on the way designers can see and experience their designs, adding substance to our concept of visualisation as a design tool with an integral role in every stage of the design process.

You will get a detailed, exclusive preview of how V-Ray Next for Rhino completes an approach we started in V-Ray 3: direct access to rendering from within Grasshopper, without the need to exit, bake, etc. V-Ray's entry directly into the parametric toolset moves towards providing interactivity and a new depth of immersion and understanding of changes, parameter impact and design evolution, simply by giving the designer a new set of eyes: seeing everything right where it happens, straight within the parametric script, changing in real time and, if needed, with the most realistic materials. That is the main reason for, and entry point into, interactive immersive virtualised design with V-Ray Next for Rhino.
Operating straight within Grasshopper, V-Ray brings its complete list of features and powers, straight through to rendering animations and supporting VR scans. Furthermore, it brings two major opportunities: GPU rendering for speed and computing power, and bridging to V-Ray for Unreal, to provide a seamless transition from the parametric plugin into interactive virtual setups.
So: interactive, fast, realistic, gamified parametric design. First-hand, for the first time, and with a hint at the next areas of research and development straight from the team.


 

 


Grasshopper3d and AR/VR for Rhino3d Meeting Notes | Grimshaw | February 2019

Georgios Tsakiridis | Grimshaw | How does VR make a difference in the design process?

VR and AR sit within the research cluster of Grimshaw's Design Technology department, where they have champions for areas of interest and research. They aim to be early adopters: they held an AR exhibition about four years ago and now use 360° VR scenes as a standard project deliverable. They also have a VR cave available when a project can justify it.

VR as a new way of working: In general, a small group within the practice will explore new technologies first and then roll them out across the practice. But VR can be a relatively simple technology with which everyone sees results quickly, so does it change what and how people design?

They set up rooms for Vive rigs and gave small headsets like the Samsung Gear to design teams. The use of these simple headsets means that the design teams use them in the design process itself, and not just in client reviews. They started with simple workflows, using N-Scape and Iris VR, which offer quick, reliable output and are accessible to the team.

For stakeholders, VR gives unmatched clarity, without the distortions inherent in CGIs.

MIPIM was a key first showcase, where they demonstrated a model of their Dubai Sustainability Pavilion, but the 'Heathrow Horizon Community' engagement was perhaps more important, because they created a set of 360s of key passenger journey points. The 'Horizon' are a group of frequent flyers who were shown VR scenes of a generic airport pier environment and asked to assess their perceptions of the width, amenities, comfort and so forth. VR allowed swift engagement with the complexity of an airport's systems, with members of the group even being able to start plotting out airport layouts themselves.

These early experiments led to the development of a wishlist for VR in the practice. This included: better design tools, integration with Rhino and Grasshopper, easy-to-customise interfaces, scene interactivity, live linking between applications, and some augmented reality.

Mixed Reality with AR

The journey is now towards the mixed-reality world of AR. To explore this, Grimshaw hired a specialist games designer and started working with Fologram, thanks in part to its easy workflow from Rhino. They saw its immediate potential, so implemented AR in quick review sessions, e.g. dynamically adjusting a stadium roof, which is far quicker than the equivalent 3D-printing process for design review. They have also tried AR at the masterplan scale, seeing the impact of adjusting the volumes of buildings in relation to one another. AR here has a distinct advantage over VR in that the 'sunglasses' style of headset means the user can stay in the conversation taking place around the model.

Grimshaw have also been developing custom apps and engaging with the video-game platforms Unity and Unreal. This is time-costly, however, and requires coders on the team. But it does provide a degree of photorealism and the animation of elements such as the doors on an underground train. The user finds themselves in a much more immersive place than before.

Grimshaw feel that they are still in the 'humble beginnings' of working with AR as an in-house technology. They are still exploring the tools and workflow, but it's a priority for investment. The 'holy grail' is interoperability: can you connect Rhino and Grasshopper with video-game engines? That has been happening for a while, but what has recently excited the team is the ability to run 'Rhino Inside', with the software being called from within other platforms.

Go-Rhino-Go

'Go-Rhino-Go' is an open-source GitHub project that was developed at a hackathon in New York in conjunction with architects from Foster + Partners and others. It allows you to call Rhino and the relevant libraries to permit the real-time building of geometry in Rhino from the Unity interface, combining these two worlds in a collaborative situation. It will never replace Rhino, but it is an in-between sketching tool with really big potential, which they want to explore further. There are certain limitations due to how Rhino is developed, but they are in discussion with McNeel, and since Go-Rhino-Go is open source they're keen to see a community grow.

The advantage of game engines is that they have the power to narrate and to communicate complex messages within a simple frame of constraints, so it becomes less about where you do the calculations and more about what systems like Unity can give us. As a result, Grimshaw have just welcomed a game developer to their team: a new breed of hire in the world of AEC.


 

Lyudmil Vanev | Chaos Group | V-Ray Next for Rhino

Firstly, it will be smarter: so smart that it takes optimisation decisions for you. A new asset editor allows common libraries, stored wherever you want, not just in V-Ray. It features a spline-curve editor for value manipulation (e.g. hue, saturation), and they have added metallic PBR-style shaders. There's a light editor, where you can set up lights without making test renders in the scene, plus a lighting heat-map analysis tool and new multi-matte elements for compositing.

V-Ray Next also has two new patented algorithms governing scene intelligence.

There's an adaptive dome light that can use image-based lighting; there's no need for light portals any more, just use the dome. V-Ray Next now has auto exposure and white balance for scenes, so V-Ray can create perfect lighting for you, and it will also handle the difference between interior and exterior lighting.

Next has cut render times by 2 to 5 times, even up to 11 times in some cases. Next is generally 20–50% faster for exterior scenes, and with GPU processing up to 18 times faster (in 3DS Max benchmarks). The general message is that you can achieve more with less.

Denoising was good in version 3.6, but had only one algorithm. It was perfect for cleaning up the end result of a visualisation, but what about a faster workflow? So they have added a new denoiser using Nvidia AI, trained on thousands of denoising patterns.

VRScan GPU

Chaos Group's material-scanning technology has been in development for 10 years. You can put any material sample inside, and VRScan captures its mathematical response data in every direction. Clients used to complain that hand-programmed materials didn't look like their physical samples; you could spend weeks tweaking them and they'd still not be happy. With the scanner, they look real.

V-Ray for Grasshopper

V-Ray allows you to render Grasshopper geometry without baking it. This leads to the ability to create animation in Grasshopper and render it directly, via a V-Ray Scene node in Grasshopper. You can also create materials in Rhino and manipulate them in Grasshopper. Grasshopper can also control the lighting, camera and sun, and again create dynamic scenes, all without baking.

Overall, these new items are about a tenth of what’s coming…

V-Ray’s VR and AR Pipeline

Using the vrscene transfer format is a great solution for taking work into Unreal. It does have one limitation: everything should be a texture, as Unreal doesn't accept procedural definitions. There's also still the need to export, as there's no live connection yet. But the V-Ray scene file does contain all the geometry, lighting and so on, and V-Ray for Unreal converts shaders, lighting etc. into native Unreal definitions. In the Unreal settings, you can directly select V-Ray denoisers and other features, and V-Ray will bake all of the lighting within Unreal; you can even manipulate the bakes in Photoshop, as they are not hidden away.

Project Lavinia

This is a new real-time ray-tracing viewer based on Nvidia DXR technology. It's a drag-and-drop viewer for V-Ray scenes created on any V-Ray platform, and it can handle scenes with billions of polygons without pre-baking or faked reflections. Where is it going? Will it be useful? Feedback to Chaos Group, please! The alpha for 3DS Max is out already, and Rhino is coming.


 

Long Nguyen | Research Associate at the University of Stuttgart | C# Scripting and Plug-in development for Rhino

Long teaches classes which assume no knowledge of C#; during the course of the workshops the students learn it and get to develop their own plugins. He also shows algorithms for computational design, achieving logic not possible with the visual parameters in Grasshopper alone. He also teaches good, clean programming practices, enabling the creation of plugins that can be packaged and distributed commercially. Example use cases might include getting elements to obey rules (e.g. don't self-intersect) or studying liquid erosion of a terrain. The next introductory classes with Simply Rhino will be in June.

Advanced classes coming soon.

In September, Long will also offer advanced versions of the workshop, covering, for example, parallel computation in C# for proximity checking, or how to make a Grasshopper plugin undertake heavy calculations in the background without freezing the main user interface.
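To give a flavour of the parallel-computation topic, here is a minimal sketch of a brute-force proximity check parallelised with .NET's Parallel.For. It assumes RhinoCommon's Point3d type; the class and method names are illustrative and are not taken from the course material.

```csharp
// Minimal sketch: find all point pairs closer than `threshold`, in parallel.
// Assumes a reference to RhinoCommon for Point3d; everything else is standard .NET.
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading.Tasks;
using Rhino.Geometry;

public static class ProximityCheck
{
    // Returns index pairs (i, j), i < j, whose points lie within `threshold` of each other.
    public static List<Tuple<int, int>> FindClosePairs(IList<Point3d> pts, double threshold)
    {
        var pairs = new ConcurrentBag<Tuple<int, int>>();
        // Each i owns the pairs (i, j > i), so no pair is tested twice and the
        // only shared state is the thread-safe bag.
        Parallel.For(0, pts.Count, i =>
        {
            for (int j = i + 1; j < pts.Count; j++)
            {
                if (pts[i].DistanceTo(pts[j]) <= threshold)
                    pairs.Add(Tuple.Create(i, j));
            }
        });
        return new List<Tuple<int, int>>(pairs);
    }
}
```

A spatial index such as RhinoCommon's RTree would scale far better than this O(n²) loop; the sketch only shows the core pattern of partitioning independent work across threads and collecting results in a thread-safe container.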


 

Jonathan Rabagliati | Foster and Partners | The Bloomberg Ramp

The project for a grand ramp in the Bloomberg building in the City of London started seven or eight years ago, but has its roots in work done by the practice 20 years ago at the Reichstag in Berlin, and later at the GLA Building in London. There they learned some of the tricks of creating a minimal, smooth appearance while satisfying code requirements for level sections within the slope.

At Bloomberg, the attempt was to build a building with a huge internal area which nonetheless respects the medieval street pattern. In the heart of the north zone of the plan there is a huge triangular space and atrium, with a ramp that rotates as it passes each floor. It's not just a conduit for people; it's also part of the ventilation strategy.

One of the challenges was how to get the clients' heads round what they were designing. They did 3D prints and presentation models, and they did lots of renders. But the development process was necessarily complex: having designed the model parametrically in Grasshopper, every 'frame' in the animation of the ramp was its own Rhino surface model, so there could have been an infinite number of different ramps.

Jonathan is passionate about curvature, and using ellipses as the basis for the form disturbed him. The inherent tightening and loosening of the curvature was no good; he wanted a more elegant solution. A plot of the acceleration and deceleration of the curvature revealed unwanted kinks. So he used an equation that is in fact just like a Spirograph: rolling one circle around another, with the ratio between the gears on the moving wheel and the circular frame creating a trochoid. In turn you end up with the setting-out of the ramp, with the skylight above being defined in a similar way.
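For reference, and as textbook geometry rather than anything given in the talk, the standard parametric form of a hypotrochoid with fixed-circle radius \(R\), rolling-circle radius \(r\) and pen offset \(d\) is

\[
x(t) = (R - r)\cos t + d\cos\!\left(\frac{R - r}{r}\,t\right),
\qquad
y(t) = (R - r)\sin t - d\sin\!\left(\frac{R - r}{r}\,t\right).
\]

The gear ratio described above is \(R : r\); the specific radii and offset used for the ramp are not given in the talk.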

There was then a dialogue to and fro with Adams Kara Taylor engineers to refine and simplify the geometry. The beauty was that you could pull out the structural model and plug it into his hypotrochoid model, which would then update the engineers' model at the same time. This process eliminated lags in coordination, but required common naming conventions and a shared language to make the collaboration work. Rather than wasting weeks on coordination, they could get on with building the Grasshopper model and doing detailed analyses of load cases for all 96 steps and the knock-on effects on all the other steps. It created a matrix of data that could be interrogated, and the efficiency freed up engineering resource to undertake a far more in-depth study than is usually done. Overall, the greater clarity reduced the uncertainty factors.

Full-scale prototyping was very important for user comfort, and also to convince the district surveyor that the proposed gaps around a glass infill panel at the landings of the ramp would be safe.

In a combination of precision and brute force, they ended up using a contractor for the bronze cladding panels based in Japan, with the substructure created at Littlehampton Welding and the elements coming together after a series of overnight deliveries into the City of London. For the contractors, they made a simple set of instructions listing the variables for each element, plus diagrams, which became a 96-page method statement of how to build it.

During the design process, Michael Bloomberg visited a mock-up and, being of lesser stature, questioned the height of the balustrade and wanted it lowered, to which the senior partner at Fosters said 'yes!', not knowing the consequences. But through another equation, the team were able to find a solution. In January 2014, they made a pioneering use of the Oculus Rift, in one of the first projects to use it to test different options, not just for client review. They had to find a way of smoothing the curves of the lowered balustrade while retaining the setting-out at the floor levels. To resolve this they had to introduce an s-curve to smooth the shapes, but it had to do so imperceptibly. So they tried various s-curves to maximise the effect and used VR to see which looked best.

The whole development process took months, with the Rhino geometry eventually transferring to a fabrication model produced in Catia within a tolerance of 0.004mm. Then came the amazing bit: yes, the fabricators had an accurate model, but did they build it right? And would the cladding fit? To check this, they did three one-billion-point laser scans of the installed substructure. They then put that dataset into Rhino, tested it against their design geometry, and colour-coded it for clash detection. The maximum deviation was 24mm over six or seven floors, meaning there were a few areas where they had to alter the geometry. But by this time the very beautiful and very expensive Japanese bronze cladding was landing at Tilbury Docks and couldn't be changed. So the adjustment process was to combine the scan and the solid model to create a virtual model where they could 'jiggle' the bronze panels, using the minimum shift distributed across them all, and set up rules for how the periodic 10mm shadow gaps could be tweaked accordingly.
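As an illustration of the scan-versus-design comparison described above, here is a hedged RhinoCommon sketch of the core deviation test. The class and method names are hypothetical, and a real billion-point scan would need spatial indexing and out-of-core streaming rather than a simple loop.

```csharp
// Hedged sketch: measure each scan point's deviation from the design geometry.
// Brep.ClosestPoint and Point3d.DistanceTo are RhinoCommon calls; the class,
// method and any thresholding are illustrative only.
using System.Collections.Generic;
using Rhino.Geometry;

public static class DeviationCheck
{
    // Returns, for each scan point, its distance to the design Brep, ready to be
    // colour-coded (e.g. flagging anything approaching the 24mm maximum found on site).
    public static List<double> Deviations(IEnumerable<Point3d> scanPoints, Brep design)
    {
        var result = new List<double>();
        foreach (Point3d p in scanPoints)
        {
            Point3d closest = design.ClosestPoint(p); // nearest point on the design surface
            result.Add(p.DistanceTo(closest));
        }
        return result;
    }
}
```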

Although they were using metrology with sub-millimetre precision, in the end the specification writing was key. The spec just called for 'a smooth continuous curve': just a few words as opposed to all that data! And you find yourself reverting back to written definitions, such as "plus or minus 2mm", and end up arguing on site when the contractor points out that this actually allows for 4mm of misalignment, because it's plus 2mm on one panel and minus 2mm on the next. The moral here is that for all the computational sophistication, don't disregard the specs!

It’s all very well designing or making things with these tools, but the process of actually realising something like the Bloomberg Ramp is just as fundamental and crucial. And don’t lose sight of the fact that the end result is about simple human interactions: the ramp enables casual interactions and conversations to take place. And one final nice reward was that the plan of the ramp was adopted as a logo for the building.


 

Gwyllim Jahn | Fologram | Making in Mixed Reality

Fologram are building software for mixed reality devices, so designers can use them for design and making. They’re interested in how you go from design packages to making things in the real world, without 2D drawings.

AR technology was originally about aiding fabrication, stemming from work done by two Boeing engineers to enable the accurate placement of systems within an airframe under construction. It can still be used for precise registration, but also now for shared experiences and to build natural, intuitive interfaces.

Fologram work with the Microsoft Hololens, which offers precise tracking, but the downside is the need to develop in Unity. So they have made a bridge from Rhino and other platforms.

Their target is to produce reductions in time and cost risk in experimental architecture. A case in point is Frank Gehry’s Dr. Chau Chak Wing Building at the University of Technology, Sydney. There, the undulating curved brickwork façade had to be installed by an expert team with painstaking precision, meaning that a bricklayer who was used to laying 400+ bricks per day was down to 80 bricks a day.

There was a clear need for a way to make the process simpler and faster, avoiding the need Gehry's office had of providing setting-out information for every single brick; to find a way to use less-skilled employees, not just master bricklayers, and for them to be able to work in parallel. So Fologram did a small test build of a sinuous brick façade using local 'brickies', who were able to build in one day what would otherwise have taken weeks, because each of the crew could see a projected hologram of exactly where each brick should go. The brickies themselves were super-excited: being able to use less-skilled labour alongside masters leads to better fees, a faster installation and a better result for the architects.

Fologram also work with art fabricators, who can use virtual templates to rapidly develop work as they go, without a steep learning curve.

It's a case of using old tools for new tricks: can we rethink old design tools? Now you can stream a model to multiple devices, so you can have collaborative modelling without CAD skills. With Fologram you can have three people work on one Rhino document simultaneously, just using three iPhones.

A classic test is the three-dimensional Voronoi diagram: can these be made quicker using these tools? You can now combine the precision of digital modelling with the ability to overlay analogue tools, all without 3D printing. They overlaid a Voronoi hologram from Grasshopper onto the workshop of a Chinese fabricator, who just had to follow the hologram and bend the metal components until everything was just right. You can even then use Fologram to augment the physical object with AR-animated elements, like a breathing skin.

And the system is very lightweight, with the ability, even using just a laptop, to combine a live 3D scan of a space with Rhino models and interact with it using an iPhone. All of this can be done anywhere in the world with just a WiFi LAN and a phone hotspot.

There’s a free mobile Fologram app available from their website and they are about to debut exciting new developments following the launch of the HoloLens 2.


 

Next Meeting! Our next Grasshopper User Group Meeting takes place in Manchester on Thursday April 4th 2019 at Arup. See here for all the details on the presenters and how to book your place.

The post Grasshopper and AR/VR for Rhino UGM – February 2019 appeared first on Rhino 3D.

AR/VR for Rhino UGM – September 2018 https://rhino3d.co.uk/ar-vr/ar-vr-for-rhino-ugm-september-2018/ Wed, 15 Aug 2018 10:24:39 +0000 https://www.rhino3d.co.uk/?p=987 With Bryden Wood & McNeel showcasing RhinoVR This is the fourth meeting of our AR/VR for Rhino User Group. The group is for those who are interested in meeting in […]


With Bryden Wood & McNeel showcasing RhinoVR

This is the fourth meeting of our AR/VR for Rhino User Group.

The group is for those who are interested in meeting in order to network, discuss and explore virtual and augmented reality solutions for Rhino3d.

The meetings follow a simple format of at least one presentation from a customer with experience in this field, followed by group discussion and informal pleasantries.

Details

  • Thursday 20th September 2018
  • 18:30 – 20:30
  • Bryden Wood – 100 Grays Inn Road, London, WC1X 8AL (See map here)

 


 

Presenters at the meeting:

Bryden Wood

The Creative Technology team at Bryden Wood is a new group of computational designers working on advanced approaches to digital design.

Virtual Reality Scene from the VR team at Bryden Wood

Within the team, our XR specialists concentrate on building interactive and data-driven applications across multiple platforms – VR, AR, mobile and web.

AR/VR Highway Scene from Bryden Wood

Together, we connect 3D project content from a range of authoring tools – including Rhino and Revit – with other data sources to create immersive and insightful experiences for project design teams and external clients.

AR/VR Scenes from Bryden Wood

In this presentation we will demonstrate our unique workflow that allows the different digital platforms to communicate and contribute to each other in a smooth, easy way. We will show our interactive applications and demo our latest VR and AR developments.

Augmented Reality Screenshot Bryden Wood

 



Robert McNeel and Associates
– RhinoVR: VR Tools inside Rhino 6 presented by Andy Le Bihan

McNeel (the developers of Rhino) have been working on some VR tools within Rhino v6 for Windows; the resulting plug-in is called RhinoVR. Although this is an ongoing project, there are some very interesting tools here for everyone interested in using Rhino as a VR platform.

Screen-grab of RhinoVR, the Rhino3d plugin for VR/AR

RhinoVR is a Rhino 6 plug-in which uses the HTC Vive or Oculus Rift head-mounted displays to render Rhino viewports in virtual reality. Using RhinoVR it is possible to navigate around the Rhino scene using the VR controllers (Vive and Rift), and via the controllers objects can be selected and moved.

Andy will be demonstrating these tools live, and there'll also be the opportunity to test them for yourself.

 


 

Thanks to Bryden Wood for kindly hosting this AR/VR for Rhino UGM.

 


Here are our notes from this meeting:

1. Bryden Wood presenters were Phil Langley, Head of Digital Delivery, and Elite Sher, Head of VR and AR

Bryden Wood is a multi-disciplinary architecture and engineering practice, building airports, hospitals, schools and residential projects. They have developed a specialism in design for manufacture and assembly. They place a big focus on digital, with a new Creative Technologies team in the office. The team is working on algorithmic design and simulation, generative design approaches, modelling and analytics, connected technologies, interaction and configuration.

Elite Sher in particular heads up the Unity development team at Bryden Wood. Her interest is in pushing the potential of VR and AR beyond being an extended rendering platform. This reflects a change in the AEC industry generally, where there has been a move from 3D CAD to BIM, with the convergence of geometry and big data.

To support this, Bryden Wood want to create data-driven interactive tools that can be everyday tools for architects and engineers. The challenge is how to bring this content into VR and AR. There are many plug-ins for visualisation and linking to BIM, but they offer limited scope for customisation of tools and analysis of the data they transfer.

The current process is to take 3D model data from BIM into another app (to keep textures) and then simplify the data before bringing it into Unity. Although it is possible to write scripts to match up component entities, this is long and tedious and requires specialised game-engine skills. So Bryden Wood have developed a framework to export 3D data to JSON and then to Unity for VR and AR. The data is stored in an SQL database, which keeps information concurrent across teams and allows different geometries to be overlaid on the base data.

This approach also allows for the creation of cross-platform apps, and extensions to web, mobile and desktop platforms. A focus now is on creating an API for this Mango database.

The workflow for Revit to JSON allows specific areas and views to be selected for export, and also handles groups and assemblies. The same approach has been applied to Grasshopper, preserving all properties when exporting to JSON, so naming conventions and the file structure are preserved into Unity. They follow the 3JS schema, which is widely used for translating data for 3D objects and is well supported. The workflow can also process and upload multiple files simultaneously, optimised by splitting them into batches and importing the JSON into Unity with all of its metadata.
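To make the idea concrete, here is a hedged C# sketch of the kind of exporter described: flattening a Rhino mesh plus its metadata into a three.js-style JSON payload. The field names and the Newtonsoft.Json dependency are assumptions for illustration; Bryden Wood's actual schema is not public.

```csharp
// Illustrative exporter: Rhino mesh + metadata -> three.js-style JSON.
// Field names and the Newtonsoft.Json dependency are assumptions, not
// Bryden Wood's actual format.
using System.Collections.Generic;
using System.Linq;
using Newtonsoft.Json;
using Rhino.Geometry;

public static class JsonMeshExport
{
    public static string ToJson(Mesh mesh, string name, Dictionary<string, string> metadata)
    {
        mesh.Faces.ConvertQuadsToTriangles(); // triangle buffers, as 3JS-style schemas expect

        var payload = new
        {
            name,
            metadata, // e.g. layer, assembly and BIM properties, preserved into Unity
            vertices = mesh.Vertices.SelectMany(v => new[] { v.X, v.Y, v.Z }).ToArray(),
            faces = mesh.Faces.SelectMany(f => new[] { f.A, f.B, f.C }).ToArray()
        };
        return JsonConvert.SerializeObject(payload);
    }
}
```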

Once in Unity, it is then possible to create rigid-body simulations, controlled by Grasshopper and easy for non-engineers to understand. The principle is that design is undertaken by designers, in Revit. These design objects can then become grabbable in a VR environment, or viewable in AR, controlled by parameters set in Revit.

Overall, Bryden Wood identify four key benefits of their approach: it is a data-driven process; it is interactive; it is accessible to the everyday user; and as such it keeps VR design in the realm of the everyday user.

 

2. McNeel’s presenter was Andy le Bihan, Developer at McNeel

Andy is specifically in charge of visualisation at McNeel, and his responsibilities have included Brazil, the rendering UI, Bongo and Snapshots in Rhino v6. Andy introduced the work that McNeel have been doing on VR support for Rhino, as well as a forward-look to Rhino 7 and other Rhino technologies.

The first thing to note is that VR for Rhino is not yet a finished product. Andy was able to show a demo of the technology in progress, but it's not ready for use yet; for anyone thinking of using it as a new platform, there's a way to go. It is, however, already available for free on GitHub in C++, and the APIs that have been developed are also available in C#, though the code itself has not yet been fully released. So Andy and the McNeel team are very interested in feedback.

The Rhino for VR demo works using Steam VR on both the Vive and Oculus Rift. In fact, Andy and his colleague David at McNeel are developing the platform sitting side by side on Vive and Oculus.

What they have developed is a plug-in for VR. Running the VR command delivers VR right from a Rhino viewport. In VR you can select objects just like in Rhino, and set up buttons to run commands, like Move. Any Rhino command can be mapped to any of the VR controllers' buttons. "It all just works".
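To give a feel for what "any command mapped to any button" can look like at the API level, here is a hedged sketch built around RhinoCommon's RhinoApp.RunScript. Only RunScript is a real API call here; VrButtonEventArgs and the handler are illustrative stand-ins for the VR runtime's event plumbing, which in RhinoVR's case lives in its C++ source.

```csharp
// Hypothetical button-to-command mapping, sketched with RhinoCommon.
// RhinoApp.RunScript is real; VrButtonEventArgs and the handler are stand-ins.
using System.Collections.Generic;
using Rhino;

public class VrButtonEventArgs            // stand-in for a headset SDK's event type
{
    public string ButtonName { get; set; }
}

public class VrCommandMap
{
    private readonly Dictionary<string, string> _map = new Dictionary<string, string>
    {
        { "TriggerA", "_Move" },          // macro syntax, as typed at the Rhino command line
        { "TriggerB", "_Rotate" },
        { "Grip",     "_SelNone" }
    };

    public void OnButtonPressed(object sender, VrButtonEventArgs e)
    {
        string macro;
        if (_map.TryGetValue(e.ButtonName, out macro))
            RhinoApp.RunScript(macro, false); // run the mapped Rhino command, no echo
    }
}
```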

Andy himself is an architect and was obsessed by Corbusier. He wanted to see if VR for Rhino could create the same sense of 'being there' as the real Villa Savoye, which he has visited many times. He showed how a model of the Villa could be drawn with the standard Rhino display pipeline, straight to the headset. To underline: this is VR running directly out of Rhino.

Adding Grasshopper into the mix, you can see the geometry but also the curves and controls. Within the VR environment, you can even open up a Grasshopper canvas and, using the VR controllers, move in and out of the Grasshopper window and adjust sliders.

The priorities for McNeel are providing APIs and guaranteeing a level of performance, with a display fast enough for big models. To achieve this they are looking at optimisations like caching shadow maps and lighting specifically for VR. What McNeel are not doing is developing a general VR interface. However, giving away the source code under license will enable third-party development; the developer Mindesk has been actively doing just that.

One issue is the adaptations to Rhino itself that might be needed to optimise certain processes, like Grasshopper. When a calculation takes a few seconds to run, this leads to lags and interruptions in the VR experience. This is due in part to the fact that Grasshopper was never intended to run in VR, and it currently occupies the main processing thread. But David, who writes Grasshopper, is thinking about moving calculations off the main thread, so in Grasshopper V2 it's possible that VR could run on the main thread instead. Multi-threaded Grasshopper is indeed coming, but the timetable is unknown, and the main Rhino would still need adaptation.

Andy also talked about the just-released 'Rhino Inside'. Currently it only works on the Rhino 7 WIP. It brings the ability to load Rhino inside another application, e.g. Revit. When Grasshopper is running too, suddenly you get Grasshopper in Revit. They can interoperate thanks to their shared use of the .NET API. Once an object is in Revit, you can imagine the connections: real-time conversion of a Grasshopper definition dumps geometry straight into Revit, and B-Rep objects convert to native Revit. Rhino Inside isn't so much about code or features as a way of thinking. Andy can see how it could also connect to software like AutoCad, Excel or even Photoshop. They just can't predict what people are going to do with it.

In addition, Andy discussed 'Rhino Compute': using Rhino as a geometry-calculation engine. You post a request via JSON etc. to a server and you get a Rhino answer back. For now they have Open-Nurbs running in JavaScript, so you can load up a file and get data out of it. The intention is potentially for the platform to run on the general cloud, as well as on a local server.
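The post-a-request, get-an-answer pattern can be sketched in a few lines of C#. The endpoint URL and JSON body below are placeholders for illustration, not the documented Rhino Compute API.

```csharp
// Hedged sketch of the request/response pattern described above.
// The URL and payload shape are placeholders, not a documented endpoint.
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

public static class ComputeClient
{
    private static readonly HttpClient Http = new HttpClient();

    // Post JSON-serialised geometry arguments; the server replies with JSON geometry.
    public static async Task<string> PostGeometryRequestAsync(string url, string jsonArgs)
    {
        var content = new StringContent(jsonArgs, Encoding.UTF8, "application/json");
        HttpResponseMessage response = await Http.PostAsync(url, content);
        response.EnsureSuccessStatusCode();
        return await response.Content.ReadAsStringAsync(); // Rhino's answer, as JSON
    }
}
```

A caller would serialise its geometry arguments to JSON, post them to the server (local or cloud), and deserialise the geometry that comes back.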

Finally, an update on Rhino 7. Yes, there will be lots of new mesh-editing tools in R7, but it's an evolution, not a revolution: "mostly it's just typing".

 


 

The post AR/VR for Rhino UGM – September 2018 appeared first on Rhino 3D.

AR/VR for Rhino UGM – June 2018 https://rhino3d.co.uk/ar-vr/ar-vr-for-rhino-ugm-june-2018/ Thu, 17 May 2018 12:19:27 +0000 https://www.rhino3d.co.uk/?p=942 This is the third meeting of our AR/VR for Rhino User Group. The group is for those who are interested in meeting in order to network, discuss and explore virtual […]


This is the third meeting of our AR/VR for Rhino User Group.

The group is for those who are interested in meeting in order to network, discuss and explore virtual and augmented reality solutions for Rhino.

The meetings follow a simple format of at least one presentation from a customer with experience in this field, followed by group discussion and informal pleasantries.

Details

  • Thursday 14th June 2018
  • 18:30 – 20:30
  • AKT II – White Collar Factory, 1B Old Street Yard, via Mallow Street, EC1Y 8AF

 


 

Presenting at the meeting:

AKT II

At AKT II, through our in-house parametric applied research team (P.art), we persistently explore new means of approaching engineering challenges. As part of this evolving process of expanding our toolkit and enhancing our design processes, we are interrogating virtual and augmented reality applications in order to evaluate their potential uses, benefits and drawbacks in the context of the AEC industry.

AKT II AR/VR with Rhino Presentation Image

While we acknowledge the significant advantages and prospects of these technologies being mainly used as visualisation tools, we also aspire to examine ways of utilising them as imperative early design tools.

Watch the AKT II AR Video Teaser here

In this presentation we will illustrate our thoughts on how virtual reality applications could supplement design interrogation exercises; specifically, how they could assist in informing early design decisions with basic structural engineering principles.

 

Foster + Partners

Foster + Partners has been exploring VR and other immersive technologies for more than two decades, having worked with Oculus Rift since the first release of its developers’ kit.

F + P AR/VR with Rhino Presentation Image

Using different devices such as Oculus Rift, HTC Vive, Samsung Gear and Google Cardboard allows designers to visualise their content and test wayfinding strategies, design comfort and spatial experience with users, while clients also become more involved in the design process, enabling faster and richer feedback.

This talk will discuss the integration of VR tools into project workflows, reflecting on the challenges and opportunities offered by immersive technologies to explore, alter and test design alternatives within a collaborative framework.

 


 

Thanks to AKT II for kindly hosting this AR/VR for Rhino UGM.


 

Here are our notes from this meeting:

AKT II – Spyros Efthymiou

AKT's in-house parametric applied research team "P.art" have been experimenting with VR for four uses: illustration, communication, design interrogation and, lastly, design enhancement (designing within VR).

For design interrogation, AKT have developed a bespoke bi-directional real-time workflow between Rhino and the Unity game engine (running an Oculus Rift) via Grasshopper. Design changes made in either Rhino or Unity are updated in the other simultaneously. The aim is a seamless integration where Rhino becomes the modelling interface for Unity.

Unity was chosen due to the simplicity of its C# programming and its large community, but AKT's approach is agnostic in terms of game engine.
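As a flavour of what a live Rhino-to-Unity link can look like on the C# side, here is a minimal hedged sketch that pushes object updates over a local UDP socket; a Unity-side listener on the same port would parse and apply them. The port and message format are purely illustrative, since AKT's actual plumbing is not public.

```csharp
// Hedged sketch: push a geometry update from the Rhino/Grasshopper side to a
// Unity listener over UDP. Port and message format are illustrative only.
using System.Net.Sockets;
using System.Text;

public static class UnityBridge
{
    public static void SendUpdate(string objectId, double x, double y, double z)
    {
        using (var udp = new UdpClient())
        {
            byte[] msg = Encoding.UTF8.GetBytes($"{objectId},{x},{y},{z}");
            udp.Send(msg, msg.Length, "127.0.0.1", 9000); // Unity listens on this port
        }
    }
}
```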

The model is built much as it would be for 3D printing: optimised and complete.

For design enhancement, they are using Unity's hand interaction to design in VR. Options can be explored without upsetting the physical modelmakers.

Again, they modify the Rhino geometry by editing it in Unity in real time. They can stretch, copy and rotate, not just manipulate.

A key benefit is being able to change locations, say to model from the rooftop of an adjacent building.

It is particularly powerful when adjusting parametric assemblies in real time – being able to see downstand beams change in depth as you extend a floor slab, or a core changing in area as you increase its height, or a beam grid dynamically recalculating as you move columns.

Overall the feeling is that to be a useful tool it relies on a seamless workflow.

 

Foster + Partners – Amy Elshafei

Foster + Partners have been using VR as a design tool for projects and VR review has even become a contract requirement.

They use it for internal meetings but also for user testing, employing a variety of approaches to change and develop the design. A particular use case is in developing wayfinding, where diverse user personas can be studied. Having human figures in the simulations was of clear benefit.

They use a variety of workflows in and out of Unity, from Revit and 3DS Max, but almost always via Rhino in one way or another. Foster + Partners had used CSV exchange between Grasshopper and Unity, but it wasn't real-time, so they developed their own DPU plugin.

Solo viewing is done via Oculus or Cardboard, where they use very light surface models since it runs on a phone. The data is stored as a 1.5GB app, fully loaded on the phone, not streamed via the web.

For multiple-person experiences they use a Solus dome, which they find stimulates client engagement. Although not immersive, the Hololens is also good, as it can be experienced by more than one person, and the viewers can be in different locations.

In all, VR enables clients to achieve faster decision-making, with better spatial understanding. It allows for better communication and design feedback from users, but it hasn't yet reduced the requirement for static renders.

The post AR/VR for Rhino UGM – June 2018 appeared first on Rhino 3D.
