UGM Archives - Rhino 3D | https://rhino3d.co.uk/tag/ugm/

Rhino UK User Group Meeting | December 2020
https://rhino3d.co.uk/events/rhino-uk-user-group-meeting-december-2020/ | Tue, 24 Nov 2020

Rhino UK User Group Meeting with a special Rhino v7 demo and update, a presentation from Mamou-Mani, plus a live Q&A with McNeel.

The post Rhino UK User Group Meeting | December 2020 appeared first on Rhino 3D.

Including Rhino v7 Demo, plus live Q&A with McNeel.

This Rhino User Group meeting featured a customer presentation from Mamou-Mani and a special demonstration of the now-shipping Rhino v7 from Simply Rhino, followed by an extended live Q&A between our audience and the McNeel (developers of Rhino) team.

First up was Arthur Mamou-Mani, presenting the latest projects from his London-based eco-parametric architectural practice, Mamou-Mani.

The second part of the meeting was the chance to see some of the latest and greatest tools in Rhino v7 and then to hear from and ask questions of McNeel, the Rhino developers, themselves!

  • The Live Event took place on Wednesday 16th December
  • Time – 18:30 – 20:30 (London, UK time)
  • The Meeting was recorded and the video is shown below

Presentation #1:

Arthur Mamou-Mani is director of the award-winning architecture practice Mamou-Mani, specialising in a new kind of digitally designed and fabricated architecture. He is a lecturer at the University of Westminster, and has given numerous talks around the world on eco-parametric architectural practice.

Mamou-Mani’s team specialises in parametric architecture, and the Grasshopper plugin for Rhino is essential to their practice. The plugin ecosystem has helped them to evolve designs based on rules and parameters, echoing natural processes. They continue to explore new technologies, such as the open-source Silkworm plugin for 3D printing, which Arthur developed with Adam Holloway.

In his presentation, Arthur discusses how Mamou-Mani’s use of tools such as Silkworm, Karamba 3D and Intralattice was integral to the making of groundbreaking architectural projects around the world. He touches upon his recent collaboration with Studio Precht to create ‘Sandwaves’, the world’s largest 3D-printed installation made entirely from sand, for Diriyah Season in 2019.

In 2019 Mamou-Mani also collaborated with fashion house COS to create ‘Conifera’ shown at the Salone del Mobile in Milan. Made from over 500 ‘biobricks’ utilising both Silkworm and Karamba, the installation is the world’s largest PLA 3D printed structure to date. This year saw Mamou-Mani design ‘Catharsis’ in V.R. – ‘an amphitheatre made of amphitheatres’ – that spearheaded Burning Man festival’s first ever virtual reality event held in the Alt Space metaverse.


Presentation #2:

Rhino v7 Demo – here are some of the improvements we take a close look at in the Rhino 7 presentation:

  • SubD including the new Multipipe function
  • Precision SubD and its advantages over previous v6-based modelling strategies
  • Quad Remesh
  • Named Selections
  • Layout Improvements
  • Refit Trim
  • Edge Continuity Analysis
  • Rendering and Display Improvements
  • Layout Integration (new in Rhino 7 for Mac)

Meeting Presenters and Panelists:


Watch the Full Video Recording of the meeting here:


Meeting Organised by Simply Rhino

Meeting Sponsored by PNY and BEAM

Check out our last AR/VR for Rhino User Group Meeting (October 2020 with Heatherwick Studio and Epic Games) by watching the video here.


AR/VR for Rhino and Grasshopper UK UGM | October 2020
https://rhino3d.co.uk/events/ar-vr-for-rhino-and-grasshopper-uk-ugm-october-2020/ | Mon, 28 Sep 2020

The post AR/VR for Rhino and Grasshopper UK UGM | October 2020 appeared first on Rhino 3D.

Join Simply Rhino, Heatherwick Studio and Epic Games, for our live & online AR/VR focused Rhino User Group meeting.

This was our first online AR/VR User Group meeting. Heatherwick Studio started the evening’s presentations, followed by Epic Games (developers of Unreal Engine and Twinmotion), and the meeting finished with a Q&A session with our three panelists. The meeting took place on Thursday 8th October 2020, 18:30 – 20:30 (London/UK time).

  • For the video recording of the meeting please go to the foot of this page

 


Heatherwick Studio has been working with game engines as part of its design workflow for years now and has developed custom design workflows and techniques that enable these processes.

Silvia Rueda will provide an insight into Heatherwick Studio’s use of immersive media, with a focus on the role of landscape design and the use of Unreal within its design process and workflow.

Image: Courtesy of Heatherwick Studio


David Weir-McCall from the Epic Games Enterprise team will take a look at the many ways that people are utilising the power of the Unreal Engine in the AEC industry to go beyond visualisations, to help bridge the gap between ideas and reality.

Looking at use-cases in the industry we will explore the different integrated workflows with Rhino and Grasshopper and how they are being used to communicate ideas, design and build in real time, and link up to sensors to create fully functioning digital twins. This includes covering works by relevant partners including Mindesk, Speckle & BHoM.

Images: Left – Courtesy of AHMM; Right – Courtesy of SPP and Imerza.


 

Meeting Presenters:


 

Organised by Simply Rhino

Sponsored by BEAM


Thanks to both Heatherwick Studio and Epic Games for joining us at the meeting.

For details on the previous AR/VR for Rhino & Grasshopper meeting you can visit here.


 

AR/VR for Rhino and Grasshopper UK UGM with Heatherwick Studio and Epic Games – Video Recording Transcript

We have made a transcript of the meeting recording, if you’d like to follow that then here it is:

Paul: Right, welcome everybody. This is the first virtual version of our AR/VR User Group Meeting, held here in the UK. It’s actually the sixth meeting of this type, but the first one we’ve held virtually. We’ve met (for this format of meeting) before at AKT II, at Grimshaw, at Bryden Wood, at the Heatherwick Studio offices, and at SOFTROOM.

I’m joined by some friends here from… two from Heatherwick Studio, Pablo and Silvia who will be presenting first.

Pablo is the Head of Geometry and Computational Design at Heatherwick Studio. Silvia is the Lead Designer of the Immersive Cluster at Heatherwick. So, they’ll be presenting first for 30 minutes or so, and then we’ll be hearing from David Weir-McCall from Epic Games, part of the Enterprise Team in the AEC area.

Just a couple of other things to mention here.  There’s quite a big group joining us.  There might be as many as 300 or so, so please with questions, if you could address them in the questions panel rather than the chat panel, that would be great.  They’re going to be monitored by myself and Steph who is in the background helping out.  So, yes, please add them in questions.  As there is quite a lot of you, there could be potentially quite a lot of questions but we’ll do our best to get as many questions to the presenters as we can.  There is also the chat dialogue opportunity.  You can use that to talk between yourselves, if you want to communicate with anyone else that you know is also participating.

What else is there to say?

We’re having this presentation first from Heatherwick.  There’s a couple of polls that we’ll ask you to complete.  Then we’ll hear from David, then Q and A’s for both presenters, then a round up.  Then after all of this, there is an opportunity to join us on the Mozilla Hubs platform, a fun little meeting, because normally after these things we would have a nice social meet up, some pizza some beer.  We can’t do that this time of course, so we’re going to invite you to come along to the space at Mozilla Hub.  Some details on that will follow after everything.

So, what I’m going to do now is handover to Heatherwick people.  Do you want to just say something as an introduction, Pablo, first?

PABLO: Sure.  I think we’ll jump on the presentation.

PAUL: Okay, I’ll jump out and see you all later.

PABLO: Okay.  Well thank you Paul and Steph for having us here today.  We’ve been part of this AR VR community for some time now and we love always to see what is happening in the rest of the industry and obviously we’re very happy to do something this time around.

So, we are from Heatherwick Studio, and we are a team of problem solvers and designers based in the heart of Kings Cross.  However, during these times, I think we’re mostly working from different parts across the UK, from our own homes.

Today’s presentation is going to focus on the studio presentation and specifically on our Unreal Engine workflow and how we use it for landscape design.

We are going to split the presentation into four main chapters. The first will cover how we use these visualisations in the studio, and then we’re going to talk about landscape design and the relationship between this and how we visualise. Then we are going to jump into a case study of one of our projects, and we’re briefly going to go through future developments.

So as Paul mentioned, my name is Pablo Zamorano.  I am Head of Geometry and Computational Design Department in the studio.  I work across all studio projects and with a team of designers that also are passionate about engaging in complex design challenges and digging deeper in terms of how things get together, from early stages to the very latest ones.  I also work with the great Silvia.

SILVIA: Hello, my name is Silvia. I am a designer and Immersive Media Specialist at Heatherwick Studio. I have a background in architecture and interaction design, and my focus is to develop and communicate the studio’s design ideas using Unreal Engine.

PABLO: So, as I mentioned, we are based in London and we try to focus on projects across all different scales and types, and we design not only buildings but also objects and landscapes, as we will see in today’s lecture.

As I mentioned before, we work across scales and typologies in every possible location. I think our main focus is to find projects that can potentially allow for a positive social impact wherever we are working. We have a special focus on materials and craftsmanship, and we are really focused on how things actually feel for people at human scale, at one-to-one scale. We like to design things that you can approach with your body, that you can feel and understand as positive elements at the human scale.

These are three examples of recently finished buildings, one in Kings Cross, called Coal Drops Yard, the middle one A Thousand Trees in Shanghai and the third one, The Vessel in New York.

We also have a focus on applied research, and particularly in my department some of the things we cover try to find the relationship between not only the physical world but also the digital one. We try not to ever get too attached to any one tool. Rather, we try to always think very deeply about the ideas and how we can develop them further. So, we use any tool we have at hand, and if we don’t have the tool at hand, we try to either look for it or find a way of making it. So, really, we try to open our spectrum of interaction between the digital, devices, fabrication methods and craftsmanship.

We apply this through different tools we have put together over the years, which now let us run simulations of the buildings, but also quickly get through every single layer of any complexity and scale that we want to work on.

We are also partnering with different organisations, private and public, and some schools like the IAAC in Barcelona, where we’ve been trying to focus on advanced fabrication, in this case using wood as the material and working with simple elements, though trying to use them in a much more complex way and trying to bridge the gap between complex design and the fabrication materials we use.

So, you can see robotic fabrication in this case; we tried to realise the designs that the students put together in this short workshop.

I really love how robots move, and the potential we can get from them is very interesting. We’re also quite interested in the relationship between our bodies and the machines we work with, and in understanding the limitations of a robot and the limitations of our bodies. So, we try to bridge this gap with tools like Augmented Reality. For instance, in this case we are using the HoloLens and mobile devices to put together assemblies that later on we can add to the bigger pieces manufactured by the robots. So, it’s an overlaying of AR and the physical things, and here are some examples of the final pieces. A piece like this is, I would say, too complex for a robot to run through the whole process while avoiding clashes in some parts, and also too complex for a person to put together without any kind of traditional guidance. So, I think it’s very interesting how merging these two worlds allows us to explore geometry further.

This is another example of the use of Augmented Reality in the studio. This is using Fologram and the HoloLens in our workshop, where we’re using it to wire-cut some foam blocks, and you can see how the model in Rhino 3D is moving, following how the physical object is moving. It’s not unidirectional: it also goes from the real world back into the digital world.

Here is how we use it (Fologram), not only as a presentational tool, but rather to understand how digital mock-ups can interact with physical ones. So, we’re testing a small object, a lift button, and then we have another digital option for the same object. If you look at it on the screen it may look okay, and you can say whether you like the design or not, but when you look at it with the goggles you realise, measuring with your body, that this object is obviously far too big, and we can action these things in real time. So, the understanding of scale is key when we’re using Augmented Reality.

We’re also teaming up with some other people, in this case Thornton Tomasetti, the engineering team, to look at how we can customise some of these tools. We had this big ambition to have an AR tool where we can import not only models that we’ve created but also physical models that we’ve made, or even sketches, and then turn them into something else, running any kind of modelling or sun-exposure or wind analysis, and then maybe editing the geometry from the mobile devices and placing it back.

Obviously for this we did a quick sketch, actually over one day. So, we shortened the spectrum to maybe three areas: having an import, being able to transform the geometry and place it back, and running some sun-exposure analysis, with the key thing in common that the tool should work cross-platform. So, it’s not an app that only works on Android or iOS devices; rather, it is in this case a web-based platform that you can access on either a mobile device or even your computer.

This is the result. So, one of the interesting points here is that you can place a geometry based on a tracker, so as you move the tracker the geometry will follow, and what you see on the screen at the upper right are the parameter controls that we allow the users to adjust: the scale, the rotation, but also the day of the year and the hour. So, you can see not only the object moving but also the environment reacting to it. Because it’s basically following the tracker, if you have to place it in a different space you can just literally use your hands and move it along with you, and it will follow. So, we’re very excited to carry on this collaboration with Thornton Tomasetti.
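The day-of-year and hour controls Pablo describes drive a standard sun-position calculation. As a rough illustration of the underlying maths only (this is not the Thornton Tomasetti tool, which is web-based and whose internals we cannot see), here is a minimal Python sketch using a common approximation for solar declination and elevation:

```python
import math

def solar_declination(day_of_year):
    """Approximate solar declination (degrees) for a given day of the year."""
    return -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))

def solar_elevation(latitude_deg, day_of_year, hour):
    """Approximate solar elevation (degrees) at local solar time `hour`."""
    decl = math.radians(solar_declination(day_of_year))
    lat = math.radians(latitude_deg)
    hour_angle = math.radians(15.0 * (hour - 12.0))  # 15 degrees per hour from solar noon
    sin_elev = (math.sin(lat) * math.sin(decl)
                + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
    return math.degrees(math.asin(sin_elev))
```

At London’s latitude (about 51.5°N) this puts the midsummer noon sun at roughly 62° above the horizon and the midwinter noon sun at roughly 15°, which is the kind of feedback the environment in the AR view reacts to as the sliders move.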

SILVIA: So, understanding that, I’m going to do a bit of a jump and we’re going to talk about landscape and how it is integrated into this design vision. Basically, we can say that we have nature embedded in most of our projects, from large projects, Toranomon-Azabudai in Tokyo on the left and 1000 Trees in Shanghai on the right, to medium-sized buildings such as the residential tower on the left and Little Island in New York on the right, as well as small buildings, for example pavilions integrated with landscape, like the recently completed Maggie’s Centre in Leeds.

So, how does Heatherwick Studio see visualisations? We think it’s critical that a richness is applied to most of our visualisations. Traditionally we used to have a series of stills that combined V-Ray rendering with planting added in post-production, and even some hand sketches, which limits our ability to present the design model in a 3D way. It means that landscape becomes something that goes on top of the image at the end and doesn’t travel along with the design process of the geometry and architecture. This is very challenging when it comes to projects like Little Island, where the planting is almost as important as the architecture itself. It’s very difficult to imagine how this would be without any planting; it looks very incomplete and empty.

So, for example, this is how landscape defines and gives contextual meaning to architecture and geometry. This is Al Fayah Park in Abu Dhabi, where if we look at the architectural elements alone it’s very difficult to understand the scale and the sense of the space we are designing for. But if we layer all of the landscape over it, then we have the whole picture and the idea of what we want to see and what the main concept is. In this case we used Twinmotion. This was five years ago, so it was our first 3D animation with planting and architecture all placed together. It was great because the client and the consultants for the first time really got to see a 360 of both of these things placed together in one animation.

Before moving on to describe the visualisation workflow, we wanted to give an overview of how this sits in our wider design interoperability workflow. Basically, we develop all of our projects using different software and platforms at different stages. Rhino is the main design software for all of the stages, Revit is very important in the later stages, and Unreal is the main visualisation tool. In this diagram we can see the highlights of each of the stages: for concept and schematic design we use Rhino and visualise it in Unreal, and for the later stages we use Revit, with a link from Revit to Unreal.

So, for landscape visualisation we developed our own workflow. Because we are looking to see more than just renders and videos, we are also researching 360 views, virtual reality and immersive media.

As part of the studio setting, we evaluated the different ways of visualising landscape: the most traditional ways, which are Rhino with Photoshop or V-Ray for Rhino, compared against Rhino with Unreal.

So, in the first approach, we use Rhino to render out a simple view and then apply all the vegetation in post-production. This gives us a fully customised image of good quality at the end, but it can get very time-consuming, especially if we want to produce more than one still. Making changes can get very tricky, as you enter a world of Photoshop layers that is very hard to handle, and we will never be able to create a real-time view using this method. When we use Rhino and V-Ray, we get amazing quality and very photorealistic results, but the polygon count of the trees we want to render is massive, so it takes an astronomical amount of time to render more than a few views. Again, we don’t get real time alongside the project.

So, after comparing all of these methods, we realised that Unreal is the way to go. We can see the project from every possible angle. The landscape is already embedded in the real-time model. Everything is fully customised, and the most important thing is that, apart from customising the look and feel of the project, we can also customise all of the landscape assets that we want to implement and need.

So, this is the first project we used Unreal on: 1000 Trees in Shanghai. As the name says, it’s a lot of trees and a lot of landscape, so we wanted to be very accurate with it, and this is the first video animation that we did. It took us around two months to get our hands in, which is quite a lot of time for an Unreal model, but from there we learnt quite a lot, and now it takes just a few days, or even a few hours, to develop an Unreal model. This was pretty exciting at the beginning as well: the client was seeing the project years in advance, with fully accurate planting and a 360 view, and you could see the whole scheme placed together in its contextual site.

So, it’s quite nice to compare the reality on the left, which is almost at completion, with the Unreal model that we had years in advance. From this, as I was saying, we learnt a few things. The first was to create a Heatherwick Studio template that begins with some materials we commonly use in our projects, with the Unreal atmosphere already adjusted. Going forward, we built a landscape asset library which, apart from containing different species of plants, most importantly contains different types of plants. So this means we have conical trees, globe-shaped trees, quite a lot of types of trees. This can also be subdivided into different categories depending on where the project is located geographically, which means that if you start a project in California you already have a folder full of the planting that you will use in California, accurate to the surroundings.

The last thing is that we improved our workflow with Datasmith: we use Rhino as the main design tool, and it’s quite easy to update any geometry in Unreal.

So, as a conclusion: back in the day we tried to produce just one Unreal model, built during developed design and referenced until construction. What we try to do now is have different Unreal models from concept until construction, evolving and changing all the time as we change the design. Now, for the landscape design process: basically, landscape design plays a key role in the visualisation of most architectural projects. Despite being so important, it’s often overlooked and treated only as something placed on top in the later stages of the visualisation process.

So, basically, to sum up our landscape design scope, it can be divided into two main packages. Hard landscape is everything that is man-made structure: pavement, furniture, landscape structures. Soft landscape is everything that is living components: trees, all planting, landform and water features. We can divide it into four layers, starting from the bottom: landform and water, then hardscape and furniture, then lower planting, then tree planting. We can conclude that the true heart of landscape is the two top layers, the tree planting and the lower planting. Here we can see the comparison: on the left are just the first two layers, and on the right we can see the third and fourth layers, which give landscape its very characteristic aspect. So this means we should pay attention, get the whole picture, and make it accurate, as this is the most seen and most important part of landscape.

So, this is normally what we deliver in our landscape package from Revit. You can see it is very dry; it’s a very traditional method of communicating all of the design information. On the right we have a technical drawing defining areas of planting, each of them colour-coded to differentiate the different hatches, and on the left we schedule each planting and describe the specific species of plant. This sounds very crazy. Apart from that, we combine this with reference images that illustrate what we are saying in the schedule, and from there, there is a massive jump between the technical drawing, what the landscape architect asks for, and the deliverable images. Using Rhino, V-Ray and Photoshop, there is a lot of artistic interpretation between what we are describing and what is really delivered, requiring a lot of manipulation in each image, and we lose a lot of fidelity and accuracy.

So, to conclude, we can see that landscape design is perceived as very technical and often detached from the project. It’s very hard to visualise landscape properly, as it is very time-consuming and normally the quality is not that good. We have a clear need to make planting design and development more inclusive and interactive, meaning we wish to involve the design team and the clients more in landscape design decisions. For this we use Unreal, and we think it is the best way to visualise landscape: it’s a dynamic tool that allows everybody to comment and see the outcome immediately, the planting is accurate, and you get a holistic view of landscape and architecture as an integral project.

So, we have this case study in Tokyo, Japan: a massive mixed-use development which includes residential, retail, office and education uses, and it has an extensive area of landscape. We are trying to create intimate human-scale gardens as well as a large-scale city space. As you can see in the plan, it’s full of landscape. What we tried to do was generate different narratives and characters for the landscape. At the entrances we wanted to create gateway plazas with cherry blossoms, to invite everyone in. In the middle we wanted to create urban orchards, so that people could interact with the landscape, as well as woodland grasses.

But we are just going to focus on the central garden. This is an enlarged plan of the planting scheme. Basically, we have different mixes of shrubs and grasses, and these are the descriptions of each mix. Each one has different species, described by colour, texture, height, whether they are evergreen, and so on. For example, take shrub mix 01, which has warm colours, red and orange. We already know the heights of all of the planting, so what we want to do is recreate this bespoke in Unreal, as accurately as we want. By loading a lot of Unreal assets into our project we can edit the colours of any of the flowers; we can remove some leaves and begin to create seasonal changes if we want. As we see at the top, we can take a cherry blossom tree and, just by changing the colours of the leaves, it becomes another type of tree. We can also combine two assets: if we have ivy and combine it with some flowers, it can become a beautiful jasmine that we can hang anywhere we want. By doing this, we recreated all of the planting that the schedule was showing us. For the grasses, what we did was duplicate one of the simple shrubs that we had in Unreal and create a new static mesh, and likewise for the herbaceous ornamentals, which are a combination of grasses and flowers.

So, when we have created all of the assets, what we need to do is create folders in Unreal that hold all of these static meshes, or all of the plantings, and then we import just the surfaces, already colour-coded to match the folders. This means that anybody can jump into Unreal, without being a landscape expert, and already plant and make design decisions.

So, this is explained here in a small video, where we have the colour-coded surfaces and the folders matching the surfaces. Then we choose the plantings, drag and drop the folder into Unreal, select which ones we want to use, and from there change the density, which is basically how much planting you want to paint when you paint it in Unreal.

So, the density is a bit of trial and error, and then, with one click, you immediately have all of the landscape on the surface that you chose to plant. Then you do exactly the same for the next one: you untick the ones you had, tick the new mix, and paint it as well. That’s how we began to create all of the lower-layer planting.

We do the same for the grasses. It’s very difficult to get accuracy in grasses, but in Unreal we can manage that. At the end we just hide the surfaces that we imported, turn on the soft scape, and we can meander inside the project and literally see whether it looks good or not. It’s super nice to walk the design and see whether you want to change some of the design decisions you had already made; sometimes you just need to do that, right? You can walk around, have a real-time 360 view of every corner of the project, and decide on all of the landscape that you need.
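To make the tick, mix and density steps concrete, here is a small, hypothetical Python sketch of the same idea: a colour-coded surface, a planting mix with relative weights, and a density that decides how many instances get scattered. The mix name and species are invented for illustration; Unreal’s actual foliage painter does this interactively on the mesh.

```python
import random

# Hypothetical planting mix: species names and weights are illustrative only.
SHRUB_MIX_01 = {"red_cornus": 3, "orange_crocosmia": 2, "amber_grass": 1}

def paint_mix(bounds, density_per_m2, mix, seed=42):
    """Scatter plant instances over a rectangular colour-coded surface.

    bounds is (xmin, ymin, xmax, ymax) in metres; density_per_m2 controls
    how much planting gets painted. Returns (species, x, y) tuples, a
    stand-in for the Unreal foliage-painting step described in the talk."""
    xmin, ymin, xmax, ymax = bounds
    count = round((xmax - xmin) * (ymax - ymin) * density_per_m2)
    rng = random.Random(seed)  # fixed seed so repainting is repeatable
    names = rng.choices(list(mix), weights=list(mix.values()), k=count)
    return [(n, rng.uniform(xmin, xmax), rng.uniform(ymin, ymax)) for n in names]
```

Painting a 10 m by 10 m bed at 2 instances per square metre then yields 200 instances, weighted towards the dominant species of the mix, and repainting with a different mix is just a matter of swapping the dictionary, much like unticking one set of assets and ticking another.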

In summary: you have a blank model with the colour-coded areas, you place the lower planting, then import the bubble trees, and then you change them for real trees.

This means we have a fully coordinated model in Unreal, giving us the most accurate planting we can have. We have more than one final landscape version, encouraging people to mix and test different versions to see which combination is best for the project. It means that Unreal allows us to interrogate landscape design in a more holistic way, involve all of the teams and consultants in landscape design progress and decisions, and, most importantly, spread the knowledge of landscape design across the architectural domain.

PABLO: Following on from what Silvia just showed us, we are looking into future developments, and one of the things we really care about is these workflows: how do we come up with the easiest ways of making the different software packages we use communicate?

So, in the diagram that Silvia showed before, there are three main pieces of software we use for design, with Rhino the key backbone of the process all the way through, from early on to the later stages. How we can connect these three together is something we are always asking ourselves and trying to improve.

So, about a year ago we teamed up with Mindesk to answer this very question: how can we make the connection between the software as smooth as possible? Mindesk already had some of this potential built in; they were using Unreal Engine to turn the Rhino environment into a virtual-reality one, and we thought, well, if these two pieces of software are already talking to each other, there must be a way to progress that into a link where we can have both environments at the same time.

Here you can see some of the results. This is now live, and it is a tool that forms part of what Mindesk offers in its software. If you modify the geometry in Rhino on one side, Unreal will automatically show you what changes are happening. This means there is a direct pipeline between the two worlds: any geometry you move is going to move across, and any geometry you import is going to get imported across live. If you modify it, it gets modified; if you hide it, it hides on the other side as well. So it makes the process of turning these lollipop trees into more realistic assets much easier. Here you can see how things are moving live.
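The live-link behaviour Pablo describes, where an add, move or hide on the Rhino side is immediately mirrored on the Unreal side, can be sketched in plain Python. This is not Mindesk's actual API; the `SceneLink` class and the dictionary-based scenes are invented purely to illustrate the mirrored-edit pattern.

```python
# Minimal sketch of a live link: every edit applied to the source scene
# (the "Rhino" side) is immediately mirrored into the target scene
# (the "Unreal" side). Plain dictionaries stand in for real scene graphs.

class SceneLink:
    """Mirrors add / move / hide operations from a source scene to a target."""

    def __init__(self, source: dict, target: dict):
        self.source = source
        self.target = target

    def add(self, obj_id: str, position: tuple):
        self.source[obj_id] = {"position": position, "hidden": False}
        self.target[obj_id] = dict(self.source[obj_id])  # mirrored import

    def move(self, obj_id: str, position: tuple):
        self.source[obj_id]["position"] = position
        self.target[obj_id]["position"] = position       # mirrored move

    def hide(self, obj_id: str):
        self.source[obj_id]["hidden"] = True
        self.target[obj_id]["hidden"] = True             # mirrored hide


rhino_scene, unreal_scene = {}, {}
link = SceneLink(rhino_scene, unreal_scene)
link.add("tree_01", (0.0, 0.0, 0.0))
link.move("tree_01", (5.0, 2.0, 0.0))
link.hide("tree_01")
```

After these three calls, both scene dictionaries hold the same state, which is the essence of the "direct pipeline" between the two worlds.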

Another very interesting thing is that most of the people who engage with Rhino in the studio don't necessarily understand how Unreal works. So there is a view link where you can actually control the Unreal viewport directly from Rhino. If you don't know how to navigate the Unreal world, you can still modify your views from the Rhino viewport and Unreal will mimic that.

Very recently, the tool has been evolving and you can now automatically modify the assets that get assigned. Unfortunately, we don't have a video showing this, because it has come up quite recently and we haven't been able to record it. But imagine these lollipop buildings in Rhino automatically getting converted into more realistic versions based on our landscape template. Of course, this doesn't only apply to traditional Rhino geometry; it can also be driven by Grasshopper geometry. This obviously has huge potential for us, because you can quickly test different options, but also animate things in Grasshopper very quickly and see how they move or change in the Unreal world.

Also, as Silvia mentioned, we engaged with Twinmotion something like five years ago. Twinmotion is the very reason we started investigating real-time rendering in the studio. We're also excited about the new developments of this software; we are currently testing it and super excited to hear more about what is coming up in the next presentation by David. We really like the ease of use of Twinmotion, and we love that we can very quickly change the season. Now, one thing that is quite key for us is being able to modify the planting assets and customise them with our own planting, based on our own landscape design. This is something we're hoping Twinmotion could introduce in the future, so David, please take note.

I think with this we can finish our presentation. So, thank you so much, and we will be answering your questions. Thanks.

Paul: Okay, great, thank you so much Pablo and Silvia.  Okay, I have some questions for you.  I’ll start with some workflow questions I think, first.  We’ll just start with this question from James.  Have you started to use Rhino Inside Revit?  Has it been useful for you or have you found it not developed enough for you yet?

PABLO: I guess this is for me. The answer is yes. We've been using Rhino Inside for quite some time at the studio already. We already have our own internal templates that focus on different customised workflows that we always use. It's been an interesting journey, because Rhino Inside is obviously still under development, which means one build may change from the previous one, and any attempt to standardise a process may need reviewing with the next Rhino Inside release. But we're still very excited and actively using it, on some very large projects that are now gearing up for construction, in fact. So it is a very powerful tool, and it's not only for Revit. I'm also hoping to hear more from someone who is using it a bit more with Unreal.

The other very good thing about Rhino Inside is that it allows you to hack it. So, we do some coding: if the tool is not doing it for you, you can actually come up with a tool that will.

Paul: Okay, I’m just going to go down these questions in any order now.  Are you looking at procedural generation of plants?  I guess a question for Silvia?

SILVIA: We have looked at literally creating the trees from scratch, but we have developed a better and faster workflow using the assets Unreal already gives us. So, it is literally a combination of all the libraries out there that people are using, made our own.

Paul: Something else here from Kevin.  At the point of workflow development that Silvia was talking about… I’m just reading this out, so when you say Unreal, are you building this library and templates for Unreal directly, or a library for Twinmotion?

SILVIA: It’s for Unreal.  We have it already, the foliage in Unreal and it’s just for Unreal.

Paul: Thank you.  Question from Lynne.  Can the assets library be shared across teams and projects so everyone has the same and updated assets all the time?

SILVIA: Correct. We start with a template, then we narrow down where the project is located, and then we issue the template with the library of plants we think can be used there already included. If needed, we add more plants to that template, depending on the project.

Paul: Okay, thank you Silvia.  Right, do you use VR with clients and/or collaborate within Unreal?  Question from Martin Johnson.

SILVIA: Not at the moment. We own the geometry and the Unreal model itself. We don't share Unreal, but we share its animations.

PABLO: So, to the point of VR with clients, I think the answer is yes, sometimes.  In some of the projects we’ve been working on, the clients actually turned out to be quite sophisticated and they have asked directly for either walkthroughs or a 360 animation where they can use their own VR domes to walkthrough the project and similar things.  So, yes.

Paul: Another workflow question.  Maybe David can answer this, or maybe you’d know as well Pablo.  Does Unreal have a Revit plug-in and how does the Revit, Rhino, Unreal workflow work?  Just before you answer that, it’s probably a good time to actually just mention that we do have a sponsor for this meeting, and that’s the developers of BEAM, which is a solution for interoperability between Rhino and Revit.  Anyway, I know Pablo has used BEAM and may be an advocate for that solution as part of a toolset, but anyway, did you get that question Pablo?

PABLO: Yes. So, we haven't connected Unreal and Revit directly yet. That's the quick answer, but there are ways of doing it. I know there is some development using Rhino Inside to do this, and also I know that Mindesk… Gabriella may be somewhere in the audience tonight, I hope. So, hopefully after this, in Mozilla Hubs, you can try to find me and ask directly, but I can tell you that some very good news about this may be coming your way from Mindesk.

Paul: Very good.

DAVID:  I’ll elaborate in the next presentation.

Paul: Quite a number of questions here. I don't think we're going to get through them all, but let's see. Right, so Pablo, have you tried Unity? How would you compare Unreal versus Unity? That's a great one. In terms of the AR/VR interface, the software interface, or within your workflow?

PABLO: We have tried many different platforms, and I think the answer to why we selected Unreal is the quality we can achieve with it. I know some of the others, like Unity for instance, are closer to the larger hacker community; people who want to dig deeper into those rabbit holes in software. But to be honest, what we really care about is the actual quality we can get from a tool at the end of the day. For us, the best results so far have come from Unreal, and this is why we engaged with it. But yes, we always try any possible solution that is out there, in terms of ease of use and quality.

Paul: Great, thank you.  Silvia, are you using Speedtree to create the vegetation or are you using a library of some sort?

SILVIA:  So, as I said before, we are using just the Unreal libraries from Epic Games.

Paul: Fine, okay, could you talk a little bit about how you onboard clients into using and viewing the work in Unreal, and generally in AR/VR, and how receptive they've been to this? It's a question from Pam Harris.

SILVIA: So, basically, it's a win/win situation. They love it. They are very engaged with it and are always asking for Unreal. It's the best way for them to see and understand all of the architectural terms and everything we are talking about; when we engage them with Unreal they get the whole picture and it's very clear, and the next steps we need to develop are super clear. Sometimes it's very tricky, because you can see everything, so we have a lot of things to finish, but I think it's very useful. I don't know if you can add anything, Pablo?

PABLO:  No, I think that says it all.  We use it for every single presentation with the client.

Paul: Very good. I think one more question, from Libney. Is it possible to upload the Unreal scene to the cloud, so the client can check it in a web browser?

PABLO:  I think that’s a question for David I think, but there are many ways of actually doing this and I know you can do this in Unreal and export in many different platforms, not only web based but also augmented reality models and so on.

Paul: Okay, so maybe that's something we can come back to. Okay, I'm just going to ask you one last question, and then we're going to go to David's presentation. But if we've missed some questions and there is something really pressing, please do let us know that it's urgent and we'll make sure we get to it at the end. I'm trying not to leave people out, but it's a bit of a challenge. Okay, let's see. What are you using for version control, mentioning Perforce, SVN or something else? That question doesn't mean much to me. Do you understand the question?

SILVIA: No, I didn’t, sorry.

PABLO:  So, how are we dealing with different versions of Unreal?

Paul: Yes, I guess so.  They are talking about asset version control.

DAVID: It’s for multiple users to engage in the same Unreal.  So, are you using it as across the office for multiple users on the same scene, or are you using it for individual users for individual scenes?

PABLO: We are using it with individual users on individual scenes.

SILVIA: Correct, but again, we have also begun to use levels, so everyone can jump into a level and change anything they need to change in Unreal, while somebody keeps the core Unreal model.

Paul: This is the last question before we go to David.  Do you do all the lighting in Unreal or do you add additional surfaces to be used as light sources in Rhino for instance?

SILVIA: The lighting is all done in Unreal, yes.

Paul: Fantastic.  Thank you Pablo, thank you Silvia.  We’ll see you again at the end.  We’re going to hand over now to David.  Thank you very much.

Okay so you’ll be made presenter now David, and I’ll say see you later.

DAVID: Okay, you should be able to see my lovely background wallpaper. First of all, a huge thank you to the guys at Heatherwick Studio. They did a great job of showing you what I want to share with you as well, which is some of the great use cases coming out of the architecture, engineering and construction industry. So, I'll quickly start my presentation, but before we get started I just wanted to share a quick introduction for those of you who maybe aren't aware. I work within the architecture and engineering industry at Epic Games, which basically means I focus on speaking to architecture, engineering and construction firms about their use of real-time rendering tools in their workflows, and the ways these can innovate work processes and outcomes. We go around and talk to a lot of people about the many uses of Unreal and the different uses of Twinmotion, and the big thing we usually do in these presentations is share the use cases behind it; Heatherwick is just one of those great use cases, in landscape design and how their workflow has come out.

I want to share another couple of examples with you today, but first of all, just in case there is anyone on the call who is a little bit unsure of the Unreal Engine or Twinmotion, I just want to spend a couple of minutes just quickly running through that for everyone’s benefit.

First of all, who are we? Well, we're Epic Games, and we have this great platform called the Unreal Engine. This is what is used to create a number of games you may recognise, the big one up there being Fortnite, which I joined a week ago, and I get my ass kicked by nine year olds on a daily basis now. But it's also used as the underlying game engine for Infinity Blade and Gears of War, and we license it out to both the games sector and the non-games sector. So, other game studios are using Unreal Engine as their tool. The non-games side is where I sit, along with a number of other great people: film and media, broadcasting, automotive and manufacturing. If anyone is a fan of Star Wars, The Mandalorian was filmed using Unreal technology, which is exciting.

But our big thing, which I think Heatherwick again did a great job of showing and which is why we're in the AEC space, is this ability to bridge ideas and reality. Within the architecture, engineering and construction community, our output isn't the same as games or film. We create reality: real buildings, constructed from ideas. So, we see these real-time rendering tools as the fastest possible way to share ideas, engage stakeholders and see the outputs of those ideas, some of which we're going to see a little later on.

But again, if you’re unfamiliar, we have two lovely products.  We have Twinmotion, and we have Unreal.  They have both got their different use cases and I just want to define exactly what they are so that you can understand when we talk about things moving forward, where the use cases for each of them sit.  The way that I usually describe Twinmotion is this idea that it’s architectural visualisations in a few clicks.  Essentially what Twinmotion is, it’s the Unreal Engine but with this wrapper around it, which has been customised and set up for ease of use for a very quick and simple learning curve. So that you can create great visuals and within a few clicks as I said, not one click but a few clicks.  It’s the comparable tool to what we see people doing with Enscape and Lumion.  It’s just there for the everyman, the every architect and engineer can have this and work alongside the propriety tools, whereas what we… here is a quick video demoing it.  What we really have in here is what Heatherwick alluded to and it’s a number of key things including how it speaks to Rhino, the way that you can use the assets whenever they are in the Engine.  We have a thing that we like to call Smart Assets, trees that interact with environment and people and animations and cars and things like x-ray materials for engineers to be able to see their designs.  The other great thing we see about Twinmotion is this asset library of about 2500 assets.

What’s coming very shortly, because Heatherwick were asking about road maps, is, these assets are about to be released on the Unreal Engine Marketplace.  So, these assets aren’t just fixed to Twinmotion but they can actually then be used in the Unreal Engine as well.  So, we are really excited about that.

The Unreal Engine, then, is very different, and how we describe it is very different. We see it as an advanced real-time 3D creation platform. It's the place where you take your visualisations and advance them to the next level. There are different use cases we see people pursuing, but really it's about engagement: virtual reality, augmented reality, creating outputs like UI configurators that give you an extra element of control. So, it's not just for visualisation; it's for going beyond that. We see it being used across the industry in a variety of different ways, and I'm excited to share a number of those you're seeing on the screen in front of you today, and to tell you a bit more about how they're using the Engine.

But in terms of both these tools, the things I really want to cover, which may pertain to you a lot because I saw many of the questions focusing on this, are why people use the Engine and what it has to offer, both Twinmotion and Unreal. I usually sum it up as these four things, but today I'm only going to talk about one of them in detail. We have this great ecosystem. We acquired Quixel, the amazing library of high-quality materials and assets, which now syncs seamlessly into Unreal and Twinmotion.

Data aggregation is about working with your proprietary software tools like Rhino and Grasshopper, and we have plug-ins that are free, fresh out of the box and optimised for these platforms, ready to use from the second you open the platform. The same goes for collaboration: from day one of opening Unreal, there's a template that allows you to create an experience you can share with other people, with multiple people exploring the same space at the same time, even across the world.

Then the last one is the assets themselves. Once you have these visual assets in 3D, we want them to be as open and flexible as possible, so you can do what you want with them: you could have a render today, an animation tomorrow, and if in a week's time you want to turn it into a virtual reality experience, a web-based application or a desktop game, those options are all available. So, you can customise the experience around your use.

But the big thing I want to cover, because I just heard a lot of it in the Q&A, is this idea of data aggregation and how it works with these external tools, with a special focus today on the Rhino and Grasshopper side. As I said earlier, we have this built-in tool in Unreal called Datasmith, and Datasmith's job is to convert external assets into Unreal assets in a very non-destructive way. It turns Rhino assets into Unreal assets; it turns Revit assets into Unreal assets. It's meant to be a very quick path that you can also optimise with automation, via a feature inside Datasmith called Visual Dataprep, where you essentially pre-customise a script that runs every time you import your model and does all the prep work ahead of time. If you know you have a nicer version of a material in Unreal, you can get it to replace it automatically. Heatherwick, obviously, had trees they wanted to replace: you can get it to take all those trees from the Rhino model and replace them with lovely trees from our Unreal asset library. So, that's what Datasmith is there to do, and it's constantly advancing to be more and more real time.
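The kind of import-time rule David describes can be sketched as a small replacement recipe. This is not the real Visual Dataprep system, which is a node-based editor inside Unreal; the rule table, asset names and paths below are invented purely to illustrate the pattern of swapping placeholder assets on every import.

```python
# Sketch of a "dataprep" recipe: on import, any asset whose name matches a
# pattern is mapped to a nicer library asset. Names and paths are hypothetical.
import fnmatch

REPLACEMENT_RULES = {
    "Tree_Bubble*": "/Game/Foliage/RealisticTree",    # lollipop trees -> real trees
    "MAT_Concrete*": "/Game/Materials/NiceConcrete",  # placeholder -> library material
}

def run_dataprep(imported_assets: list) -> dict:
    """Map each imported asset name to the asset it should become."""
    result = {}
    for name in imported_assets:
        result[name] = name  # default: keep the asset as imported
        for pattern, replacement in REPLACEMENT_RULES.items():
            if fnmatch.fnmatch(name, pattern):
                result[name] = replacement
                break
    return result

prepped = run_dataprep(["Tree_Bubble_01", "Tree_Bubble_02", "Facade_Panel"])
```

Because the recipe runs on every import, a re-exported Rhino model keeps getting the same prep applied automatically, which is the non-destructive workflow described above.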

And it’s not just us.  We are huge supporters of makings sure that our platforms speak broadly across the AEC.  I think there is this common understanding and everyone on the call will be aligned to it.  We have a big interoperability problem within the AEC, or opportunities, in ways that we work and a variety of different tools.  We need them to speak to each other better, and some of the projects and some of the tools we’re seeing emerging by people within the AEC to answer that call, are really exciting.  Tools like Speckle and the BHOM by Buro Happold the work which is being done within Rhino Inside (from McNeel) and also Mindesk.

I just want to quickly touch on these for those of you who are unaware. First of all, big news from, I think, last week: we now have an official Rhino exporter into Datasmith. Before, it always just went through the built-in FBX route, but now it's been optimised for the tool, so we're really excited.

But Speckle and the BHoM are the two big ones where we see a lot of potential and development. If you don't know it, Speckle is an amazing open-source data platform looking to solve the problem of these multi-program workflows with a cloud-based system that shares data and geometry across platforms: you can have Revit assets that you then see in Rhino in real time, and whenever you make a change in Rhino, you see it in Revit. It's this interconnected system that both Speckle and the BHoM are looking to address, and integration with Unreal has been explored with HOK and Mobius Node. We are supporters of people exploring this interoperability space. We gave them a MegaGrant, and they're working on developing this tool to create a more integrated system through Speckle, making Unreal part of that equation.

Similarly, there is the great work going on with Rhino Inside. Rhino Inside is, in general, a C# tool, which lends itself more readily to Unity. But for those of you using Unreal, there is an external wrapper called USharp, which provides a C# interface into Unreal, allowing you to use Rhino Inside within the Unreal Engine.

So, there are different tiers of expertise you need for each of these, but it's really interesting to see how people are rising up and addressing this interoperability pipeline workflow.

The last one, which again got a lot of focus in the previous presentation, is Mindesk. It has to be the simplest of them all, creating a real-time, synced, bi-directional workflow between Rhino, Grasshopper and Unreal, with the added bonus of a great interface that allows you to model and work in a virtual environment and see that automatically change in Rhino and Grasshopper.

So, in general, we’re really excited about the way the industry is approaching this problem of data aggregation and bringing… not just for the Unreal Engine, but for the entire industry, and we’re happy to be a part of that.

And the last thing, just before I jump on to some cool stuff, is that we have this great tool called Twinmotion and this great tool called the Unreal Engine, and what we're really excited about, and we just released the beta of it, is that you're very soon going to be able to export models out of Twinmotion and import them into the Unreal Engine. Basically, that means you can create a very quick, beautiful architectural visualisation in Twinmotion, very quickly and simply, throw in some lights and some assets, and then export the entire thing into the Unreal Engine to add the next layer, beyond visualisation into something else. This is for the digital specialists, the visualisation experts or UI creators, so they can add that layer on top. So, we see a streamlined workflow that can be used across the process, from architects and engineers all the way through to the technology specialists. We're really excited about the development of that.

So, I want to spend a little time talking about the many different ways we see people use the Unreal Engine. We see a lot of people using it for architectural visualisation and VR, but we see it being used across a broad range of areas throughout the building information life cycle, from concept design all the way through to building and operations, and that's what I want to share with you today: people who are using it across a variety of areas, from digital twins to training to visual communication and virtual collaboration, and some of the ways they're doing that. Again, it's the great data aggregation tools out there that allow Unreal to really come out and shine, and they're always part of these processes. So, I'll share a number of them with you today.

What's really funny about the way the tool has developed is that we find ourselves speaking less and less about the visual fidelity of Twinmotion and Unreal, mostly because we feel it's a given. The visual quality you can get out of these tools (this is Twinmotion) is just known; it speaks for itself. So, we spend less and less time talking about the visual quality and fidelity you can get out of Twinmotion, or alternatively out of the Unreal Engine, and find ourselves focusing more on the UI and the use cases that push beyond it. That said, for anyone who is really interested in the arch-viz side, I just wanted to throw these two examples out there to interest you.

We recently worked with The Mill on creating videos, released at Unreal Fest, about the different use cases and industries using the Unreal Engine, and what's special about this collaboration with The Mill are these effects you're seeing: the construction of the buildings, the ripple effect, the Inception/Dr Strange-esque style of creation. We actually have a webinar that shows you how we created it. Similarly, this example here uses real-time ray tracing. We have that asset for free on the Unreal Marketplace, so you can go in and see exactly how it was set up, the assets that were used, the light settings, to help you understand how these visual qualities were developed and achieved. Like I say, we rarely find ourselves focusing on these any more.

We are more focused on these amazingly fun areas of immersive design. People are using it in the early stages, or in a more immersive sense, like AF Consult, who are using VR as a way of designing with the client. This is a GIS map that has been brought in, and they're using it to draw potential transportation, road and rail links through that 3D geometry, but in VR, in a multi-user session, so other people can be in there at the same time to share and understand it. All the way through to work I was part of, for CallisonRTKL, where they have people in virtual spaces all the time, and they've built a tracking tool so they can see where people go in these virtual spaces, what they look at and which objects they're hitting.

So, they can better define which areas are important, where people are not going, what people find interesting, and then plan and design around that. Is there an area people aren't interested in? Is there something we can do in that area to grab attention and focus? That's a really great use case coming out of the Engine. And although we saw Speckle and the BHoM, there are a bunch of other data-integration pipelines that focus more on parametric building design. This one was designed by Cornell University and works with CityEngine, bringing in the parametric control that CityEngine has and exposing those functions within the Unreal Engine.

So, lots of really cool use cases in immersive design, but I would say the biggest and most beneficial use we see, which again Heatherwick touched on, is this ability to communicate ideas, information and vision, be it just in an immersive space or by creating custom asset tools. This is a building asset management tool created by Cityscape for leasing managers to use when speaking to future tenants, so they can explore the building footprint and see it at one-to-one virtual scale. But it also links that Unreal model to a financial model, so they can draw out new floor plates and have the cost of that space update in real time. This kind of integration with real-world data, giving context to that information, is where we see a really powerful use of the tool; and then there is exploring that new space in the Engine, all the way through to the work Arup are doing. They have this amazing driving simulation game: because it was all created in Unreal as a 3D asset, they built a customisable driving simulator that was a fly-through and a real-time walkthrough, which they actually hooked up to an entire driving simulation game.
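The floor-plate-to-financial-model link described above boils down to recomputing a cost from the geometry as it is drawn. A minimal sketch, assuming a flat rate per square metre (the rate and the shapes are invented for illustration, not taken from Cityscape's tool):

```python
# Sketch of "draw a floor plate, see the cost update": the plate is a 2D
# polygon, its area comes from the shoelace formula, and the cost is just
# area times an assumed lease rate.

def polygon_area(points):
    """Area of a simple polygon given as (x, y) vertices, via the shoelace formula."""
    n = len(points)
    s = 0.0
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

def lease_cost(points, rate_per_m2=45.0):
    """Cost of a drawn floor plate at an assumed rate per square metre."""
    return polygon_area(points) * rate_per_m2

# A 20 m x 10 m rectangular floor plate: 200 m2 at 45/m2 -> 9000.0
plate = [(0, 0), (20, 0), (20, 10), (0, 10)]
cost = lease_cost(plate)
```

In the real application the recomputation would be triggered on every edit of the plate outline, which is what makes the cost readout feel "live".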

So, it's just great to see the different ways people are communicating and engaging around these ideas, all the way through to the work being done with Accucities. Their newest tool is Planned City: they have a digital twin of London and a variety of different cities throughout the UK, and the way their model links in with city data to help future city planners see their future buildings and city data in context is really exciting. Planned City looks at integrating your actual models, your geometry and your information, whether it's a simple massing model that you can then run a bunch of simulation tools on within the application, like the visibility sightlines it's doing here, showing where your building can be seen from, all the way through to importing your own building design, if you want to see it more clearly, and running similar simulations on that within the application.

So, the communication side is a huge area, but I think the most important and most relevant to today's climate is this idea of virtual collaboration, and this actually addresses one of the questions brought up by Paul at the end of the last session, which we will get to in just a second: now that we're not able to be together, being able to meet virtually in spaces and still communicate and explore ideas. The Unreal Engine has a collaboration template that allows multiple users to be in a space like this. Thea, the company you see in front of you, built on top of it. They said, this is great, we'll add more functionality, allowing deeper control and integration of your ideas, so you can share them virtually, in VR, sketch, annotate and communicate, both in VR and on desktop, which I think is really exciting. There are betas currently online, it's called Big Room if you're interested, and I can provide links to all of that afterwards.

But I guess the big thing that is really important and really cool in this day and age is that there is no longer a need for big, powerful gaming computers to view the content. The example I want to share is from a company called PureWeb, who use a new feature of the Engine called Pixel Streaming, or their own version of it, where you load your executable Unreal instance onto a server and share it as a web-based application. So, this is running in Google Chrome, where all the functionality of the experience, the sales configurator, the quality and the fidelity, is running in a browser, through a web link that you can just share, and you can do this fresh out of the box using Pixel Streaming. There are lovely tutorials online about it. You can then look to host it on your own servers for much larger audiences using AWS instances, or, where it gets a bit more technical, that's why things like PureWeb exist: they work with you, you give them the visual content you want, and they handle the back end, the servers, the hosting and the graphics, so that you can go and share this content. So, lots of different avenues to explore.

The next one is actually fresh off the press; it happened about a week ago: people hosting virtual events and talks. This is again a screen grab, to show this was in a web browser, of the World Digital Built Environment 2020 event. It was created in Unreal, and it let users explore a virtual space built in the Engine and watch talks, with virtual production techniques like green screen built into the platform, so you can view it in the browser, navigate and walk around, and experience presentations on a much more one-to-one basis. We are seeing a lot more of this emerge, which is really exciting.

The last one is just that other people are looking at these virtual collaboration tools too. So we have a free one; ESRI have created a free one for the Engine as well; Theia have worked on a much bigger one; and I think Spaceform by Squint/Opera is again a much higher-level, professional one. So there are lots of different options to explore and use in this space, whether you want to create it yourself or use one of these great ready-made ones in front of us.

So, that’s virtual collaboration. A lot of our conversations are focused on that at the moment, for obvious reasons, and although this next one may not pertain much to the work that you guys do, I always keep it in because I love the use cases and the different ways people are using the Engine. This is something we commonly see used in real estate and by building operators, like Aedas Homes, who, like the tools we just saw, mix virtual production with the Unreal Engine: they bring in their 3D assets of a building, but the woman you’re seeing on the screen isn’t a recording, it’s a live feed to their studio and you can ask her questions. Does this lovely couch come in a mahogany green? Yes it does. Then you can change it. So it’s bridging that ability: when we can’t be together, what’s the best way to engage with people and talk about big ideas? All the way through to the work that Line Creative have recently done. We’ve all seen AR before, but this has to be one of the highest-quality examples I’ve seen, just in the level of quality they’ve managed to retain from the Engine in these AR applications. And the tools around this in the Engine are constantly getting better, which basically means you can have these AR models out in the middle of a city without needing a marker like a sheet of paper in order to load the model up, which is super exciting.

Then you have Imerza. This is an awesome one, mainly because it’s a 3D printed model: they use 12 laser projectors to project imagery from the Unreal Engine onto it. So they’re able to communicate with large audiences and large groups of people around a fixed master plan. It’s probably the best master-plan model I’ve ever seen, and it links in live with the Engine. It’s real time, so the view you’re seeing is from the Engine; you can change it and move it around and it updates on the physical model as well, which is what we really like to see.

For those of you looking ahead to future trends, we’re having more and more conversations around this idea of digital twins and smart cities, which is really exciting, and we’re so glad to be a part of it. We see a lot of digital twin solutions going around where people bring in geometry and try to represent sensor information so it’s understandable, and this is where I feel game engine technology comes into its own. The ability to put sensor information into context, in a controllable UI, is really how you get understanding of that data, and how it becomes information.
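David’s point about putting sensor data into visual context can be sketched in a few lines. This is a hypothetical illustration only; the room names, readings and colour ramp are invented and come from none of the tools mentioned:

```python
# Sketch: normalise raw sensor readings onto a blue-to-red colour ramp so
# they can be visualised in context (e.g. tinting rooms in a real-time
# engine). All names and values below are hypothetical.

def reading_to_rgb(value, lo, hi):
    """Clamp and normalise a reading, then map it to an RGB tint."""
    t = max(0.0, min(1.0, (value - lo) / (hi - lo)))
    return (int(255 * t), 0, int(255 * (1 - t)))  # red = hot, blue = cold

# Hypothetical temperature sensors keyed by room name.
sensors = {"lobby": 19.5, "server_room": 31.0, "office_2f": 22.4}

for room, temp in sensors.items():
    print(room, reading_to_rgb(temp, lo=18.0, hi=32.0))
```

In a real digital twin the colour would be pushed into the engine’s material system rather than printed, but the normalisation step is the same idea.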

This is a company called 51World, who have built an entire business around working with clients and developers to bring their physical assets into a digital UI. So they can control them, visualise live sensors in the Engine, see how they’re performing and interrogate it, from a very small scale, like a floor-by-floor building, all the way up to much larger sizes and scales. They’ve done it for these bigger buildings, all the way through to, I think we released a video online a couple of weeks ago, an entire city. It’s massive. So yes, they’re building digital twins for cities, and it’s really exciting to see what they’re going to do with it next.

I realise we’re running out of time, so I’ll jump into our last one, which is scale and size, the open-world ability. We’ve talked about data aggregation from Rhino and Grasshopper; the ability to work with those tools now includes things like LIDAR scanning technologies, so we can import LIDAR scans. This one is by a company called Virtual Wonders, and the model you see in front of you is, first of all, real time, you can walk around it in a VR headset, but those are points you’re seeing. That’s not meshes, or points converted to meshes: the point density is so tight that it actually appears as a solid object. So as scanning technology improves, what real-time engines can handle is also increasing, and a lot of people are pushing the boundaries on this. All the way up to what Buildmedia have done, which is build the entire city of Wellington in New Zealand in Unreal. And it’s not just Wellington, they’ve put in the entire country, using a very streamlined process of GIS data, photo scans and BIM information in a refined workflow. This isn’t a video animation, this isn’t just special effects and clever cutting; we’ve kept in the views from the Engine, which show it running at a 50 frames per second rate as you go through the model. And what’s great about one platform, many assets: they’re now working with Wellington City Council on integrating IoT information to turn it into a digital twin. So the places your models can go are amazing.
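The observation that a sufficiently dense point cloud reads as a solid object comes down to point spacing versus the size of a pixel on screen. A rough back-of-envelope sketch; the field of view, resolution and spacings below are illustrative assumptions, not figures from the talk:

```python
import math

def pixel_footprint_m(distance_m, fov_deg=90.0, horizontal_res=1920):
    """Approximate width of one screen pixel projected to a given distance."""
    view_width = 2.0 * distance_m * math.tan(math.radians(fov_deg) / 2.0)
    return view_width / horizontal_res

def looks_solid(point_spacing_m, distance_m):
    """A scan reads as a continuous surface once neighbouring points land
    within roughly one pixel of each other on screen."""
    return point_spacing_m <= pixel_footprint_m(distance_m)

print(looks_solid(0.002, 5.0))   # 2 mm spacing viewed from 5 m away
print(looks_solid(0.002, 0.5))   # the same scan viewed up close
```

So the same cloud can read as a surface from across the room and dissolve into points as you approach, which is why ever-denser scans keep raising the bar for real-time renderers.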

That’s pretty much everything I want to talk about; I realise I’m coming up to my half-hour slot. The last thing is really the importance of this slide, which we spoke about at the beginning: if you’re not on the Unreal side of it and you’re working in design tools like Rhino & Grasshopper, Twinmotion is a great start for you. The Unreal Engine is great to adopt, but there’s a slightly steeper learning curve, and that’s where you rely on your digital specialist team if you have one, or on being very technically savvy yourself. It’s great to jump in, because the potential of what you can do is limitless. But the whole idea is that you can work with one and translate into the other; that’s really where real-time rendering technology needs to go.

And the good news, the great news, is that Unreal Engine is completely free. It’s open for anyone to use; I love this fact. Unless you’re creating a game that you’re going to release on the PS4 or Xbox, generally we don’t want to hear from you as far as licensing goes. If you’re releasing a PS4 title of the newest building then maybe we should have a conversation, but in general, use it, develop it for what you need, and we don’t ask for anything in return. It’s there to use: the source is available, and you can add to it and build on it through the Marketplace, and that’s what I love about it.

On the Twinmotion side, it’s not free; it’s a paid product, though free to try. But the good news, which was brought to my attention today, is that Twinmotion is currently free to Rhino users. We have a new tie-in for a limited time, so any Rhino 5 or 6 owner can get a perpetual licence. It’s not a subscription; we don’t do the subscription thing with it. With Twinmotion you have the software and it’s yours forever, absolutely free. I’ll make sure the link is in the chat at the end of this; all the information on getting and using it is there, and with the version you get, you’ll receive all the updates up until the end of 2021, which again is great news for all. So, that’s everything I want to talk about.

The last thing, which I spoke about very quickly earlier on, is this idea of MegaGrants. We’re very big on supporting innovation in the industry as a whole. You saw earlier that we’re supporting HOK with Speckle, and a number of the people you may have seen in the presentations are also recipients. It’s a pot of $100 million in grants that we’ve allocated to developers to create amazing things using these tools. So if you have a great idea, if you’ve got something you really want to experiment with and try, we encourage you to create a proposal and send it in to our Epic MegaGrants team, and you could receive a grant that enables those developments around integration and interoperability, like Speckle. We’ve seen some great stuff come out of it, and we encourage everyone to think about how you can push the bounds on solving these large-scale problems.

Lastly, I can’t actually answer anything on this because it’s currently heavily under development, but I’m sure everyone saw Unreal Engine 5, which is coming in 2021, and this is the future of where real-time engines are going. If you haven’t seen any of the footage, go and have a look; this is what we’re working towards, and Unreal Engine 4 is a great place to start, because the switch-over will be seamless: everything will work and translate across, with all the added benefits.

That’s me. I think I’m bang on my half hour; I’ve done this a lot, so I think I’ve timed it pretty well. So I’ll stop my screen share, go back to Paul and open up for some questions.

PAUL: Great, thank you David. It would be excellent if Silvia and Pablo could join us again as well, because there are some other questions for them too. Brilliant, thank you Silvia, thanks David. Excellent, I loved the announcement about free licences of Twinmotion for users of Rhino 5 and Rhino 6.

DAVID: It’s awesome.

PAUL: Let me put the link in the chat so everyone can have a look at it and explore it, and yes, we can jump to questions.

I’m going to ask this question first because it was marked as urgent, and I think it’s a question for you, Silvia. How do you quickly replace the bubble trees from Rhino with an Unreal Engine tree asset? Would it replace all objects at once, or is it a more manual process?

SILVIA: So, as Pablo mentioned, we can use Mindesk for that: you have the tree asset named in Unreal and link it through Mindesk, and it will literally change as we speak, the moment you connect everything in Mindesk. But if you are not using Mindesk, you can just place little dots and then paint over the area of the dots to create the trees.
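The placeholder-dot route Silvia describes, marking tree positions in Rhino and substituting real assets on the Unreal side, amounts to a name-matching pass over exported objects. A conceptual sketch only; the asset and placeholder names are invented, and neither Mindesk nor Datasmith actually works through this code:

```python
# Sketch: export simple placeholder points from Rhino, then generate a spawn
# record for a real tree asset at each matching location. Names are hypothetical.

placeholders = [
    {"name": "tree_dot_01", "location": (0.0, 0.0, 0.0)},
    {"name": "tree_dot_02", "location": (4.0, 2.5, 0.0)},
    {"name": "bench_01", "location": (1.0, 1.0, 0.0)},
]

def instances_for(asset_prefix, points, asset="SM_Tree_Oak"):
    """Return spawn records for every placeholder matching a name prefix."""
    return [
        {"asset": asset, "location": p["location"]}
        for p in points if p["name"].startswith(asset_prefix)
    ]

spawns = instances_for("tree_dot", placeholders)
print(len(spawns))  # only the tree placeholders are matched
```

A live link like Mindesk automates exactly this kind of matching, which is why the swap happens all at once rather than object by object.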

PAUL: Thank you. I wanted to mention, as I did right at the start, that this event is being recorded, so we will be posting a link to everybody where you can watch it again. A question on coding experience: how much, and what type of, coding experience is useful for people interested in Unreal, and is any such experience required for Twinmotion?

PABLO: I would say none. There is no coding required to jump into either of these packages; I think that’s the beauty of them. That doesn’t mean you can’t go further if you want to, but…

DAVID: The thing I like about it is that you guys who use Grasshopper should be familiar with it: Blueprint is a very visual, scripting-based UI. You can of course go into the coding, and this is where your expertise comes in: you can do a lot more with C++, or use the USharp wrapper, which lets you work in C# on top of the C++ API. You can get into the coding, but by no means do you have to. In fact, I usually say that if you have to, then we at Epic Games need to be working harder to provide that solution for you without coding. So that’s our stance: it’s Blueprint and visual scripting, which I think people here will be familiar with.

PAUL: Many will be. Okay, was there a mention of a timeframe for a Twinmotion-to-UE bridge?

DAVID: We’re saying, I think, early 2021 at the moment. We’re working on the beta with a number of people, so we’re exploring it, but there’s no official announcement date yet; it’s just in the works. Pablo and Silvia, I can’t remember if you were part of that, but it’s in beta at the moment. No official announcement date, apologies.

PAUL: Okay, there’s interest in MegaGrants. Are there any requirements? Do you look for particular skill sets or particular experience? Leonardo is very interested in your MegaGrants.

DAVID: Generally, there are some requirements we look at. We care about a number of things. First of all, innovation: that’s one of the big ones; we want the idea to be innovative. And because we’re very open, we also have a strong preference for proposals that are altruistic, that benefit an industry or a wider group of people, by being free or open source. As for skills, it’s only that you either have the technical skills to explore the idea, or are willing to work with parties that do. We’ve seen MegaGrants that include and work with companies and developers who use Unreal Engine, and we’ve seen them from Unreal developers alone, so they come from across the board in a variety of ways. I would say what matters most is the innovative and altruistic side of the proposal; the expertise matters too, but it’s a good question.

PAUL: Thank you.

DAVID:  I look forward to seeing his proposal.

PAUL: Yes, I think one might be on its way. Is the TM/UE bridge available to the public?

DAVID: Not in public beta yet, no. We’re running it with a number of people who are obviously heavy Twinmotion and Unreal users. At some point that will change, but it’s early days at the moment. Trust me, it will be announced and shared on all our channels when we start to have something. For those of you more familiar with the Engine, at Epic Games we’re really open about including people in our previews and early versions of things; you can download the newest 4.26 preview of Unreal, which is not the official release but a beta for people to start exploring. So we are really open about sharing these things, and once it becomes available, you guys will be among the first to know.

PAUL: Excellent. Is it possible to say a little bit about hardware? Pablo, perhaps from your experience, what hardware is required for good performance with Unreal? I guess that’s workstations rather than headsets?

PABLO:  Silvia?

SILVIA: So, basically, we are using Alienware or Dell Precision machines, but we can also run it on our regular studio computers. It will be a bit slower, but you can open it anywhere, basically.

PABLO: Maybe it would be beneficial for the audience, Silvia, if you could mention the size of a file, so the…

DAVID: I guess that’s the important thing: it depends on what you do with it. The instances where you see Buildmedia bringing that level of fidelity to an entire city will obviously require a little more than smaller projects. Generally, what Silvia said is a good all-round answer: you’d need a reasonably good graphics card, but Alienware or MSI laptops can run it quite happily and functionally, all the way up to professional-grade workstations. The one limitation I would note is that if you’re working on master-planning projects and want to bring in 20,000 assets, higher-grade machines will obviously save you time on optimisation. But there are also a number of great optimisation tools within the Engine that make working with large files and assets easier: LODs, culling, all of these let you explore and work on the models without needing top-end graphics cards. It varies, use case by use case.
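The LOD idea David mentions, swapping to cheaper meshes as an object gets smaller on screen, can be sketched as a threshold test. The thresholds and the screen-size estimate below are illustrative assumptions, not the Engine’s actual heuristics:

```python
# Sketch of level-of-detail selection: pick a cheaper mesh as an object's
# approximate on-screen size shrinks. Threshold values are made up.

def select_lod(bounds_radius_m, distance_m, thresholds=(0.25, 0.05, 0.01)):
    """Pick an LOD index from the object's rough screen coverage.

    radius / distance is a crude stand-in for projected screen size.
    """
    screen_size = bounds_radius_m / max(distance_m, 1e-6)
    for lod, t in enumerate(thresholds):
        if screen_size >= t:
            return lod            # big on screen -> detailed mesh
    return len(thresholds)        # tiny or distant -> cheapest mesh

print(select_lod(2.0, 5.0))    # close up: most detailed LOD
print(select_lod(2.0, 500.0))  # far away: cheapest LOD
```

Culling works on the same principle taken to its limit: objects that contribute nothing to the frame are skipped entirely, which is how large scenes stay interactive on modest hardware.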

PAUL: Okay, thank you. Is there still a place for tools like 3ds Max? It seems your visualisation workflow has changed a lot, so how do tools like that fit into the workflows we’ve seen tonight?

DAVID: I can jump in on this one. In general, as Unreal Engine, we rely on everyone’s tool sets. If a company is using 3ds Max, then great, we want to be able to support it. The thing is, these packages are perhaps more tailored towards other industries or other specialities. As the profession matures, the computational side is becoming more dominant, and a much smarter and faster way of working, so we’re seeing more people using those tools than 3ds Max. However, speak to our virtual production people in the film industry and they’ll say the complete opposite; that’s where 3ds Max and Maya are maybe a little more dominant. So we don’t see it ever going away, and we will continue to make adaptations for it. But in AEC, and Silvia and Pablo will agree on this computational designer role, to put it in context: in gaming we talk about next-gen consoles all the time, the next generation is coming. We very much see computational design tools like Rhino and Grasshopper as the next gen for architects, so these things will come to fruition more and more.

PABLO:  I agree.

PAUL: Thank you. Is it necessary to have RTX cards? I guess that’s specific to NVIDIA boards… go on, David.

DAVID: It’s not, and it is. If you want to start pushing the boundaries on real-time ray tracing, then yes, you are definitely going to need RTX hardware. It always comes down to what you’re looking to do. What is the output? Higher graphical and visual quality, and larger models and assets, will usually require it. There are comparison charts on the Unreal website where you can see what works and which features need particular hardware, but I would say there is a use for it; don’t rule it out if you want to do real-time ray tracing, which is awesome.

PAUL: Is there a possibility to live-link Grasshopper and Unreal Engine without Mindesk? Pablo, is this possible? I guess there’s not going to be as nice and slick a solution as with Mindesk?

PABLO: David mentioned a couple. The BHoM is one way of having this direct link; we’ve actually used it. We had a mini project going with Buro Happold, testing and developing it with them, so that’s one way of doing it. You mentioned Speckle as well; we haven’t tested the connection with Speckle, but it seems to do it too. And there’s also Rhino.Inside.

DAVID: What I was trying to get at with these different solutions is that they require different levels of expertise and knowledge. Speckle and the BHoM are great, but there’s a lot of understanding you need around those tools and how to use them, whereas what you saw done with Mindesk is a very simple, quick, efficient, out-of-the-box way of doing it. And Rhino.Inside requires knowledge of Visual Studio and USharp. So there are lots of different routes to this real-time bi-directional link between Grasshopper and Unreal, and which is best really depends on your knowledge of the area; I think they’re all great. It just depends on your use case and the experience and time you have to invest in them.
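Whichever bridge you pick, Mindesk, Speckle, the BHoM or Rhino.Inside, a live bi-directional link has to solve the same underlying problem: working out which objects changed since the last push and sending only those. A tool-agnostic sketch of that change-detection step, with invented object data (none of these tools actually use this code):

```python
import hashlib
import json

# Sketch: detect changed objects between syncs by comparing content digests,
# so only modified or new geometry gets pushed across the link.

def digest(obj):
    """Stable fingerprint of an object's (hypothetical) geometry payload."""
    return hashlib.sha1(json.dumps(obj, sort_keys=True).encode()).hexdigest()

def changed_objects(previous_digests, current):
    """Return ids whose digest differs from the last sync (or are new)."""
    return [oid for oid, obj in current.items()
            if digest(obj) != previous_digests.get(oid)]

prev = {"wall_01": digest({"pts": [0, 0, 0]})}
curr = {"wall_01": {"pts": [0, 0, 3]},   # moved since last sync
        "roof_01": {"pts": [1, 1, 1]}}   # newly added
print(changed_objects(prev, curr))
```

Pushing only the diff rather than the whole model is what makes these links feel "live" even on large Grasshopper definitions.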

PAUL: Questions are flying in at the moment. How is lighting typically handled in these scenes today: is it baked, or real time?

DAVID: How do you guys do it at Heatherwick?

SILVIA: If we are showing it in real time, we don’t bake it, but if we are doing a proper animation, yes.

DAVID: We’ve seen a lot of new tools coming out in this area. I think GPU Lightmass is being released with 4.26, which allows a more real-time approach to baked lighting quality. But it always comes down to optimisation of the Unreal asset. If you’re trying to make your Unreal files more streamlined and hit those high frame rates, dynamic lighting is very tough to do, because it demands more graphics power. Baking your lighting in, although it’s fixed, gives a higher level of fidelity, and anyone who has seen the Unreal Engine 5 trailers knows that’s exactly what UE5 looks to blow out of the water, by allowing high-quality global illumination controlled in real time rather than baked in. So that will soon be a thing of the past, but for now there’s room for both.

PAUL: A question for Pablo. How is Heatherwick accounting for occlusion culling in AR scenes? The question goes on: ARCore uses the Depth API and ARKit is using LIDAR; are you implementing these new APIs in your process?

PABLO: Rather than going through each part of that question in detail, I think it’s better if I describe how we are using AR at the moment. We are using Fologram to connect Rhino with HoloLens and mobile devices, and for that we control absolutely everything through Grasshopper. But we are also starting to use ARKit; since the beginning of this year we have been testing different platforms, and ARKit is one of the ones we like the most, which the person asking also mentioned. The bespoke tool I presented, which is web-based, is the product of a one-day hackathon, so it’s really at an early stage, it’s in sketch mode, and we will be addressing all these questions as we go ahead with the project.

PAUL: And a linked question for David: is Unreal working on Blueprints to account for these new occlusion culling technologies?

DAVID: Probably not as Blueprint things; again, I think this is something that will come with Unreal Engine 5’s capabilities. Not that I’m aware of a direct focus on it, but when there is, it will be a feature of the Engine rather than a particular Blueprint option. I would hang fire on that one.

PAUL: Okay, those questions were from Chance. Now, there was something else from Gabrielle for you, David: is there an expected timeframe for when the Unreal 5 beta will be available for partners and developers?

DAVID: Again, unfortunately not. It’s always a fun conversation to have; everyone wants to get their hands on it first. Even within 2021, there’s no exact timeframe we’ve set for when the beta comes out or when the early release lands. The only thing I can tell you is that when we know, you’ll know, because we make all these things very public. The reason it isn’t at the moment is that there’s nothing publicly out there yet on this. So hang fire and keep an eye on the news is what I’d say.

PAUL: A question from Reece, maybe for Silvia: can you mix and match assets coming in from Rhino with assets from Maya and 3ds Max?

SILVIA: Yes; the question is whether you can import geometry from different packages, right?

PAUL: Yes, what assets, so I guess geometry?

SILVIA: Yes, obviously. It depends how you’re importing them, but you can import from every package and have them all in Unreal. That’s the beauty of it.

PAUL: Another workflow question. Is optimisation for the mesh distance fields that Unreal Engine uses handled in Rhino or Datasmith, or is there a manual control for that?

DAVID: I’m not quite sure what part that refers to.

PAUL: Reece, if you want to word the question in a different way, I’m happy to ask it again.

PABLO: Is it asking where the optimisation happens?

PAUL: I’ll go on to something else and come back. We have a poll for everybody, and I wanted to mention a couple of things. Thank you to all of the presenters: thank you Pablo, thank you David, thank you Silvia, fantastic presentations, thanks for joining us. We will be back, and we have something scheduled: although we don’t have a date yet, Grimshaw have agreed to present at the next version of this meeting, which will hopefully happen in January, maybe sooner, but you’ll be the first to know. Please sign up for our newsletter or follow us on social media at Simply Rhino and you’ll hear about all of this. As you know, we do Rhino training and we supply Rhino software; just to mention a couple of classes coming up: Rhino Level 1, Grasshopper classes, and more advanced Rhino classes, all of which are being delivered live and online at the moment.

So, shall we go to Silvia, and Silvia’s little explanation of how Mozilla is going to work for us?

SILVIA:  How do we send the links?

STEPH: I put the links in the handouts. There is a PDF in the handouts that has the links to the rooms in it. Hopefully people will be able to see that.

SILVIA: Right, so if you guys want to have a little talk with us, and between yourselves as well, just click on the links provided. We have four different rooms: the first one is The Vessel and the others are magic places you can click on to enter. Just press Enter Room, choose an avatar (I’m going to be Santa Claus if you want to find me), enter the screen, allow your microphone, and then you can navigate inside each room with the same controls as in Unreal: A for left, D for right, W for forward and S for backward. You can pan with your left mouse click and you can jump to places with your right mouse click. If you are close to an avatar you will hear their voice more loudly, and if you are far away you will not be able to hear anything.
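The proximity voice behaviour Silvia describes, louder when avatars are close and silent beyond a certain range, is a simple distance-based gain curve. A sketch with illustrative radii, not the platform’s actual falloff values:

```python
import math

# Sketch: per-avatar voice volume as a function of distance, full volume
# up close, linear falloff, then silence beyond an audible range.
# The radii below are invented for illustration.

def voice_gain(my_pos, other_pos, full_volume_radius=2.0, max_range=15.0):
    d = math.dist(my_pos, other_pos)
    if d <= full_volume_radius:
        return 1.0               # right next to them: full volume
    if d >= max_range:
        return 0.0               # too far away to hear anything
    # linear falloff between the two radii
    return 1.0 - (d - full_volume_radius) / (max_range - full_volume_radius)

print(voice_gain((0, 0), (1, 0)))    # standing beside another avatar
print(voice_gain((0, 0), (20, 0)))   # across the room: silence
```

Real platforms typically use smoother (e.g. logarithmic) curves, but the effect, conversations forming naturally in clusters, comes from exactly this kind of falloff.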

PABLO: Can you also copy and paste the links into the chat box, because I don’t think we have the handouts actually.

STEPH:  I can do that.

PABLO: Are we all going to be in the same room, or are we splitting up?

PAUL: I’m going to go to the Spanish Port, because that sounds like Barcelona and I think I know who will be there.

SILVIA:  I am there.  So, I can see a lot of faces now.  This is quite fun.

PAUL: So, we’re going to close down this window and say goodbye from here, and we’ll see you at the Spanish Port, or The Vessel, or these other places very soon.

The post AR/VR for Rhino and Grasshopper UK UGM | October 2020 appeared first on Rhino 3D.

Grasshopper UK UGM | November 2019 https://rhino3d.co.uk/events/grasshopper-uk-ugm-november-2019/ Wed, 23 Oct 2019 13:24:21 +0000 https://www.rhino3d.co.uk/?p=1580 Grasshopper UK UGM – Special Updates from McNeel (RhinoInside, Sub-D, Rhino 7 and more..) Join us and McNeel at AKT II for this special Grasshopper User Group meeting with updates […]

The post Grasshopper UK UGM | November 2019 appeared first on Rhino 3D.

Grasshopper UK UGM – Special Updates from McNeel (RhinoInside, Sub-D, Rhino 7 and more..)

Join us and McNeel at AKT II for this special Grasshopper User Group meeting with updates from the team at McNeel, including details on RhinoInside, Sub-D, Rhino 7 and more.

AKT II will kick off the evening’s proceedings before we move on to McNeel’s presentation and an extended Q&A with you and the McNeel team.


 

  • Wednesday 20th November 2019
  • 18:30 – 20:30
  • AKT II, White Collar Factory, 1 Old Street Yard, EC1Y 8AF
  • You can book your place at the meeting HERE via Eventbrite
  • We kindly ask that all interested in coming to the meeting book only 1 ticket per person – space is limited and we want to have as many of you along as possible – thanks!

 


 

Presenting at the meeting:

AKT II’s p.art team will present a selection of recently completed projects and applied research in the fields of advanced manufacturing, mixed reality, and IoT performance sensing.


 

McNeel on RhinoInside, Sub-D, Rhino 7 and more…

  • Rhino/Grasshopper Inside Revit, Unreal Engine and more

Rhino.Inside® is an open source Rhino WIP project which allows Rhino and Grasshopper to run inside other 64-bit Windows applications such as Revit, Unreal Engine, etc.

  • QuadRemesher

  • Sub-D Modeling

Subjects that can also be discussed:

  • Cycles – Realtime Raytrace Renderer
  • Rhino Compute Service

 


Thanks to AKT II for hosting this meeting. You can see photos from our last Grasshopper meeting at Heatherwick Studio in July here.


Grasshopper UK UGM | July 2019 https://rhino3d.co.uk/events/grasshopper-uk-ugm-july-2019/ Thu, 04 Jul 2019 11:07:02 +0000 https://www.rhino3d.co.uk/?p=1397 Grasshopper UK UGM – Special Update from McNeel (RhinoInside, Sub-D, Rhino 7 and more..) Join us and McNeel at Heatherwick Studio for this special Grasshopper User Group meeting with updates […]

The post Grasshopper UK UGM | July 2019 appeared first on Rhino 3D.

Grasshopper UK UGM – Special Update from McNeel (RhinoInside, Sub-D, Rhino 7 and more..)

Join us and McNeel at Heatherwick Studio for this special Grasshopper User Group meeting with updates from the team at McNeel, including details on RhinoInside, Sub-D, Rhino 7 and more.

Heatherwick Studio’s Ge-CoDe team will kick off the evening’s proceedings before we move on to McNeel’s presentation and an extended Q&A with you and the McNeel team.


 

  • Tuesday 23rd July 2019
  • 18:30 – 20:30
  • Heatherwick Studio, 356-364 Gray’s Inn Road, London, WC1X 8BH

 


 

Presenting at the meeting:

 


Heatherwick Studio’s Ge-CoDe team will be presenting their ongoing research on mixed reality and emergent technologies applied to recent projects.

 


 

McNeel – the presentation from McNeel will include:

  • Rhino & Grasshopper in AEC and a few User Projects

 

  • Development platform and Tools

  • New Developments & Frameworks (Rhino WIP)
    • QuadRemesher
    • Sub-D Modeling
    • Cycles – Realtime Raytrace Renderer
    • Rhino VR
    • Rhino Compute Service
    • Rhino/Grasshopper Inside Revit: Rhino.Inside® is an open source Rhino WIP project which allows Rhino and Grasshopper to run inside other 64-bit Windows applications such as Revit, AutoCAD, etc.

 


 

 

Thanks to Heatherwick for kindly hosting this Grasshopper UGM.

 

Thanks to everyone who joined us – great night, again! Check out some photos of the evening in this gallery:

 


Rhino UK User Group Meeting 2019 https://rhino3d.co.uk/events/rhino-uk-user-group-meeting-2019/ Fri, 12 Apr 2019 15:09:40 +0000 https://www.rhino3d.co.uk/?p=1209 Simply Rhino presents the Rhino UK User Group Meeting 2019 which will be taking place on Wednesday June 12th 2019. Book tickets now.

The post Rhino UK User Group Meeting 2019 appeared first on Rhino 3D.


Simply Rhino presents the
Rhino UK User Group Meeting 2019

 

We’re pleased to announce that the Rhino UK User Group Meeting will be taking place on Wednesday June 12th 2019.

Join us and the wider Rhino3d community at this popular event and watch presentations from Rhino users, plus get hands-on with live software demonstrations from our exhibitors.

Date: Wednesday 12th June 2019
Time: 09:00 to 17:00
Location: The Crypt on the Green, Clerkenwell Close, London EC1R 0EA

Timed agenda for the Rhino UK UGM is available now – download it HERE.

Thanks to all who helped make the meeting a great day
– a write up of the event and photo gallery are available here.

Rhino UK UGM Presenters for 2019 include:


Rhino Roadmap

Carlos Pérez Albà - McNeel Europe

An insight into McNeel's culture and new developments, including:

  • Company Update and Markets
  • New in Rhino 7 WIP (Sub-D, Cycles, YAK, Gradient Hatches, Single Stroke Fonts)
  • New Developer Tools and Frameworks (Compute, Rhino Inside, Rhino VR)
  • Ecosystem (Community, 3rd Party Plug-ins, Workshops, Events)


Carlos Pérez Albà studied Economics (University of Barcelona and Otto-Friedrich Universität Bamberg) and Illustration (Escola Massana Barcelona) and joined the Customer Support team at McNeel Europe in October 2000. He is currently in charge of Sales, Marketing & Business Development for the EMEA region and is the food4Rhino Product Manager.

http://www.mcneel.eu/


A new generation of racing goggle developed with the new generation of athletes – Fastskin Pure Focus goggle

Chris Johnson - Pentland Brands

Fierce, fearless, fast. The Fastskin Pure Focus goggle has been designed in collaboration with the world’s top athletes. The goggle delivers a 5% reduction in drag, making it Speedo’s most advanced and fastest racing goggle ever. Chris Johnson presents the design development story of the Fastskin Pure Focus goggle.


Chris Johnson is the Industrial Design Lead for Pentland Brands, with responsibility for leading the design creation of equipment and footwear for the Speedo, Canterbury, Mitre and Berghaus brands. Chris has more than 20 years’ experience working within design and innovation consultancies and global sports brands. His work at Speedo spans four different Olympic Games product development cycles and has resulted in Red Dot design awards and multiple sports technology patents relating to garments, equipment and digital devices. He holds a bachelor’s degree in transport design from Coventry University and an MBA from Durham University Business School.

https://www.pentlandbrands.com


Translating complex geometry for real-world fabrication: a Heatherwick Studio approach

Pablo Zamorano - Heatherwick Studio

Showcasing the processes behind their world-renowned designs, this presentation explores how Heatherwick Studio is engaging with emergent technologies and utilising Rhino and Grasshopper in the realisation of recently completed projects, including New York’s Vessel, as well as in their current explorations of mixed reality in construction – enabling collaboration with local craftsmen, ensuring quality throughout the build process and allowing designs to be pushed to their limits.

Photo credits
1. Vessel Interior - courtesy of Michael Moran for Related-Oxford
2. Vessel Interior 3 - courtesy of Getty Images
3. Vessel with The Shops & Restaurants at Hudson Yards - courtesy of Michael Moran for Related-Oxford


Since joining Heatherwick Studio in 2015, Pablo has been instrumental in projects and in the Geometry and Computational Design group.

As Head of Geometry and Computational Design at the studio, Pablo works across all studio projects providing expertise and guidance on new technologies and techniques, and the execution of challenging geometries.

He was Deputy Project Leader on Coal Drops Yard, an award-winning retail quarter and public space in King’s Cross, London. He oversaw all packages on the scheme and coordinated their delivery between the studio and the executive architect during Stage 3 and Planning. From Stage 4 onwards, he focused on assisting the team to take the design through contract and later into construction. He was key to the development and fabrication coordination for all complex geometry related areas on the project, most notably the Upper Level.

Prior to working at the studio Pablo was based at SOM London where he worked on a number of projects and competitions of multiple scales in Europe and the Middle East. One of his most notable projects was a high rise development of five towers with over 1,000 residential units in London’s Greenwich Peninsula, currently under construction.

Before relocating to London in 2010, his career was split between New York and Santiago, Chile, where he won international competitions and completed several projects. Pablo has lectured widely, and his personal work has been published and awarded internationally.

Selected Projects

Coal Drops Yard, London

Qualifications

MSc, Emergent Technologies and Design, Architectural Association, London, 2011

Architect License, Universidad Central de Chile, Santiago, Chile, 2004

BSc (Hons) Architecture, Universidad Central de Chile, Santiago, Chile, 2003


The theory of a synthesis that brings together art, architecture and design

Lee Simmons

Lee Simmons is an award-winning artist with a portfolio ranging from high-profile commissions in the heart of London to private collections. Lee’s work draws inspiration from the context or place where an object will be used or observed. Crossing the boundaries between artist, craftsman and designer, Lee regularly employs numerous practical skills as well as the latest cutting-edge technologies to convey his bold and innovative ideas.

All presentation images © Lee Simmons


Raised in Stevenage, Lee Simmons gained a First-Class Honours degree in metalwork and silversmithing from Sheffield Hallam University before undertaking an MA in the subjects at the Royal College of Art. He works from his recently-built studio in Hitchin, Hertfordshire.

His existing work can be found at 66 Wigmore Street, London, and in the Royal College of Art’s private collection; to date his clients have included museums, galleries, public institutions, livery companies and Great Estates. Lee has won numerous awards, including the New Designers Award 2009 for best graduate in show and various awards from The Goldsmiths’ Company.

Lee’s portfolio of work can be found at www.leesimmons.com

https://leesimmons.com/


Ennoble for Crystal & Gemstone Designs | Swarovski

Graham Hench - Swarovski Professional

Ennoble for Rhino3D “makes your life and your designs better” when working with crystals and gemstones. Originally developed for the design and engineering teams across the Swarovski business units – now six years in the making – Ennoble has been released for external beta testing. This talk will introduce Ennoble in the context of digitization of the jewelry / fashion / luxury industries, driven by tools such as Rhino3D. We’ll introduce and demo some of the basic Ennoble feature set – the connected Swarovski digital catalogs, automated placement, compiled Bill of Materials, cavity creation, automated structural creation, and optimizations for product visualization, rapid prototyping, and metal production – and show highlights from finished jewelry and fashion accessories created using Ennoble.


Graham Hench has a degree in Quantitative Methods & Computer Science from the University of St. Thomas and over 10 years of experience in information technology (research and development, business development, management, and corporate innovation), currently working as the Associate Director Digital Ventures for Swarovski Professional, headquartered in Wattens, Austria.

https://www.ennoble.io


Creating the kilometre-long parametric ‘Flow Wall’ for Turkish Airlines at the new Istanbul Airport

Oliver Salway - Softroom

Softroom used a combination of Rhino, Grasshopper and VR to design a kilometre-long installation, the ‘Flow Wall’, which unites nearly 19,000 square metres of lounges completed by the architects for Turkish Airlines at the new Istanbul Airport. At over 1,000 metres in length, the Flow Wall is one of the longest parametric interior forms in the world. It is the result of an international competition, won by Softroom, to deliver lounges that embody Turkish Airlines’ new brand philosophy of ‘Flow’. Softroom founder Oliver Salway will discuss the creative process and lessons learnt from its fabrication and installation.

Photo of the presentation piece is © Ikoor


Oliver Salway is a founding director of Softroom, an award-winning architecture and interiors studio based in central London. The practice has developed a particular specialism for sculptural interiors across a broad range of sectors, including hospitality, commercial and cultural spaces, as well as large-scale retail projects. Softroom has worked with some of the biggest commercial brands in the world, alongside grand public institutions. Major clients include the British Museum, V&A Museum, Eurostar and Virgin Atlantic, and awards range from a D&AD Yellow Pencil to the Stephen Lawrence Prize.


Sanctuary. Wherever you need it.

Karl Lenton - Seeds

The original inspiration for the Seedpod was born from our work in the public sector, where we found that there was often tension between the need for privacy, pressure on space, and constraints on budgets. Specifically, we wanted to help NHS health professionals serve their patients and staff better. So we designed the Seedpod: a portable private space where professionals and patients could meet for privacy, one-to-one conversations, mindfulness and more.

We soon realised that what works in one complex environment can be valuable in others. Seedpods now provide private, calming space for innovators, place makers, educators and exhibitors.

A beautiful, versatile solution to the universal need for nurture.

Rhino 3D modelling software was used throughout the project, from initial concept design drawings and models used to test various key design fragments to full scale manufacturing 3D models. Rhino 3D software transformed an initial pencil sketch into a fully developed market ready product.


Karl Lenton spent six years in architectural education in Leeds and London, winning the John CASS Award for Social Entrepreneurship in 2014. He is the Creative Director at Seeds and co-founder of architectural practice Burr Lenton Architecture (BLA). Karl was shortlisted for the RIBA South West Regional Live Project Award in 2014 and long-listed for the RIBAJ MacEwen Award: Architecture for The Common Good in 2016 for his work on projects for vulnerable people. Karl is an appointed member of LeedsBID (Business Improvement District), designing public realm interventions that improve people’s wellbeing, productivity and happiness. Karl is the co-author of The Free Prisoner, which he presented at Oxford University and RIBA. He has taught at Leeds Beckett University and has been a guest critic at MArch (Moscow School of Architecture), The CASS, Brighton University and Central Saint Martins.


Kangaroo – what’s new?

Daniel Piker

Kangaroo (a plug-in for Rhino) is a live physics engine for interactive simulation, form-finding, optimization and constraint solving. Daniel, the developer of Kangaroo, will update us on what's new and what's coming in soon-to-be-released versions of this plug-in.
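To give a flavour of what “constraint solving” means here, below is a toy position-based example: two points are iteratively projected until they satisfy a target-length goal. This is only an illustrative sketch of the general idea – the function name and parameters are invented for this example, and it is not Kangaroo’s actual API or solver internals.

```python
# Toy position-based constraint solve: pull two 2D points toward a
# target separation by repeatedly projecting them along their segment.
# (Illustrative only -- not Kangaroo's actual API.)
def solve_length(p0, p1, target, iterations=100):
    for _ in range(iterations):
        dx = p1[0] - p0[0]
        dy = p1[1] - p0[1]
        dist = (dx * dx + dy * dy) ** 0.5
        # Each endpoint absorbs half of the length residual.
        corr = 0.5 * (dist - target) / dist
        p0 = (p0[0] + dx * corr, p0[1] + dy * corr)
        p1 = (p1[0] - dx * corr, p1[1] - dy * corr)
    return p0, p1

a, b = solve_length((0.0, 0.0), (4.0, 0.0), 2.0)
print(b[0] - a[0])  # -> 2.0 (the endpoints meet the target length)
```

A real solver handles many such goals at once (springs, anchors, collisions), averaging the competing projections each iteration – which is what makes interactive form-finding possible.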


http://kangaroo3d.com/


Simply Aircraft Interiors

James Woodington - Safran Seats GB

The presentation will cover the following:

  • How we have developed our skills and capabilities using a combination of new software (Rhino, V-Ray, Bongo, Unity and Enscape).
  • How we integrate the new software into our design process, who uses the software and how we use it.
  • How we have used virtual reality plugins to aid designers early in the design process.


James Woodington is a Senior Industrial Designer at Safran Seats GB, where he has designed business class and first class aircraft interiors for five years. He was lucky enough to work on projects such as Skyroom for Singapore Airlines and the Optima platform, which already has a number of customers. He is also responsible for maintaining and improving the visual capabilities within the design department, most recently through the integration of mixed reality into the company, which has already delivered massive savings in the design process.


Wild Design Studio

Carlos Bausa & Dirce Medina - Wild Designs

Wild Design Studio works at the intersection of architecture, digital fabrication and art. We are a group of architects specialised in computational design and prototyping who started dreaming up Wild a year ago. Our passion is designing shapes in different languages from the ones used in our normal architectural work environments, so we decided to start working out our own design style in our free time. Slowly, the chaotic design ideas in our minds started to settle, taking shape as WILD DESIGN STUDIO, a creative firm focused on small-scale design objects and architectural artistic installations. We use V-Ray and algorithmic modelling tools within Grasshopper for Rhino to envision our ideas and inspire people with the designs we love producing, and we use metal 3D printing techniques to bring them to life and offer people something different, beyond the routine of everyday life.


CARLOS BAUSA
I’m an architectural designer working at the intersection of architecture, art, design and academia. Thanks to parametric 3D modelling and rendering tools, I shape my thoughts and visualise them prior to bringing them to life. I worked at Foster + Partners for the last four years, and before that I had the chance to collaborate with another architecture studio from Barcelona called External Reference. I have taught computational design at schools like the Institute of Advanced Architecture of Catalonia and the Institute of European Design in Barcelona.

All these years helped me gain experience in the world of visualisation and 3D modelling, and widened my vision of what an architect can do with these tools. I have always felt the need to create new shapes and objects with an artistic touch, whether digitally fabricated or hand-crafted. I like getting in touch with the materials and exploring the best way to do it myself, at a scale where it is feasible for one person to manipulate all the components.

Because of that need to craft things and create designs, in parallel with our professional work, my wife and I decided to set up our own design firm, Wild Design Studio. We created this firm to bring together our design experiments and ideas: to explore design options with freedom and without corporate constraints, creating our own shapes and fabrication ideas, and crafting and visualising them at our own pace, in the ways we like the most.

DIRCE MEDINA
I’m an architect by definition, but I prefer the term “designer” [in the making], as it suggests a wider range of areas to move across – design, architecture, art – while leaving room for more to come. The combination of rendering, parametric and fabrication tools, among others, has allowed me to shape my ideas, and working through the years in different studios has contributed greatly to this process: starting in Barcelona at EBMT, later in Mexico at FR-EE, and currently as part of Heatherwick Studio.

Exposure to a variety of projects has increased my personal interest in material experimentation and object crafting. Wild Design Studio emerges as the space for this curiosity, with a commitment to test materials, fabrication and ideation processes and, ultimately, to push design to its limits.

Rhino UK UGM 2019 Event Partners and Exhibitors:

The post Rhino UK User Group Meeting 2019 appeared first on Rhino 3D.

]]>
Grasshopper UGM | Manchester | April 2019 https://rhino3d.co.uk/events/grasshopper-ugm-manchester-april-2019/ Thu, 21 Feb 2019 16:07:16 +0000 https://www.rhino3d.co.uk/?p=1066 With Arup and Bauman Lyons For our second Grasshopper User Group Meeting of 2019 we are meeting at Arup’s Manchester offices. Grasshopper, now built into the popular 3D Modelling CAD […]

The post Grasshopper UGM | Manchester | April 2019 appeared first on Rhino 3D.

]]>
With Arup and Bauman Lyons

For our second Grasshopper User Group Meeting of 2019 we are meeting at Arup’s Manchester offices.

Grasshopper, now built into the popular 3D modelling CAD tool Rhino v6 for Windows, is widely used within the AEC industry, with growing interest in product and industrial design, furniture and jewellery design. Whether you are exploring and perhaps new to the subject, or a veteran user, we invite you to come along and hear from and network with key industry users. More on Grasshopper here

The meetings follow a simple format of at least one presentation from a customer with experience in this field, followed by group discussion and informal pleasantries.

Confirmed Presenters are our hosts, Arup, and architects Bauman Lyons.

Details

  • Thursday 4th April 2019
  • 18:30 – 20:30
  • 6th Floor  3 Piccadilly Place  Manchester M1 3BN United Kingdom (Map here)

 

Heads up! Read the write-up notes from the meeting at the foot of this page.


Presentation by Arup

Our presenters are Daniel Edmiston and Mark Joynson – Structural Engineers | Building Engineering | Arup

Stadium image showing use of Grasshopper by Arup

Our presentation focuses on why and how we utilise Grasshopper. We describe our initial experiences in implementing the tool in the design development of two of the stadiums currently under construction for the 2022 FIFA World Cup™, the Iconic Stadium in Lusail and the Qatar Foundation Stadium in Doha, and then outline how we have since developed the skills and tools on various other projects, including its application at early concept stage, and our proactive approach to up-skilling the team across the region.


Presentation by Bauman Lyons

Our presenter is Matt Murphy – Architect | Bauman Lyons

MassBespoke is a parametrically driven timber cassette construction system that integrates structural design and fabrication into an automated system that can be accessed by architects from outline design stage to produce bespoke buildings through distributed local CNC fabrication.  

Mass Bespoke Parametrically Driven Timber Cassette Construction Units

MassBespoke is primarily aimed at ordinary (rather than extraordinary) construction; Grasshopper has given us the capability to handle the complexities that arise from creating a single system that can iterate through endless different design inputs.

We feel this gives rise to a paradigm shift for everyday bespoke construction – in particular in the level of R&D that can be afforded, as it is invested in a reusable system rather than in reinventing the wheel.

Street Elevation Image for Mass Bespoke

 

Our goal is to create an easier way for architects to add value through good design without the drawback of added cost and uncertainty of bespoke construction and to enable a clearer path for manufacturers to sell their products to the end user through the future possibility of integrated compatible systems.

Mass Bespoke Workflow image

The presentation will discuss different stages of our workflow, including some of the different tools & plugins we are using in our Grasshopper definitions, as well as how we’re starting to improve data management and output verification as we move towards mortgage-backed accreditation of MassBespoke construction.

 


 

Notes and photos from the meeting:

First up was a joint presentation from Mark Joynson, an Associate from Arup Liverpool, and Daniel Edmiston, a Senior Engineer from Arup Manchester.

Back in 2014, Mark began work on the Health and Wellness Stadium for Qatar 2022. Largely self-taught, as Grasshopper was not widely used within the Liverpool office at that time, he linked Rhino and Grasshopper with the in-house Grasshopper plug-in Salamander (which in turn allows further links to the structural analysis tool GSA), before passing this phase of work on to Arup Madrid.

Mark then discussed work completed on the Foster + Partners-designed “Lusail Iconic Stadium”, currently under construction for the 2022 FIFA World Cup in Qatar. Arup’s focus for this project included the three main stadium components: the Roof, the Vessel and the Bowl. All models from the architect were in Rhino format (as that was the primary design tool); Grasshopper use came later, after concept stage, as the mesh became more defined within the Scheme stage. Instead of exchanging 3D models with the architect, they exchanged Grasshopper scripts, resulting in greater efficiency through a shared parametric design approach. For the Bowl, Digital Project, as well as Rhino with Grasshopper – again linked through Salamander to GSA for static, dynamic and thermal analysis – all played an important part.

Following these project successes, Daniel mentioned the formation of the “Grasshopper Club”, a group which has met on 26 occasions so far at various Arup offices throughout the UK. Within these sessions they discuss, experiment and run competitions, all to encourage the sharing of expertise and the use of Grasshopper.

The third project was an international contest of ideas for roofing the Roman amphitheatre “Verona Arena”. The scheme featured a retractable cantilevered roof which had to be exceptionally considerate of the existing structure. Geometry Gym translation tools were used to share the model with analysis tools, allowing the structural concept to be interrogated both visually and analytically.

We then heard about “The Loop”, a curved 500m pedestrian and cycle bridge structure over varying terrain, linking Lilleaker and Lysaker in Oslo, Norway, where Daniel and Arup worked on the feasibility and concept design. The Grasshopper plug-in Human UI successfully assisted with the creation of a simplified interface for those with less Grasshopper experience but still required to perform a design coordination role.

Lastly, Daniel presented Speckle as a solution for Rhino and Grasshopper users – and far more – potentially coordinating seamless workflows between disciplines, practices and technologies. An approach and solution we will be hearing much more about on future projects!

 

 

Our second presentation was from Matt Murphy of Bauman Lyons Architects, based in Leeds, who is also now a Director of MassBespoke, which has become a fully fledged start-up in its own right.

MassBespoke is an insulated timber cassette system combining the benefits of mass production with bespoke design. From its conception in 2012, the requirement for parametric controls was clear, and Grasshopper features centrally within the MassBespoke workflow. But this is unlike the contexts you would perhaps normally associate with the tool – there is no grand or signature architectural output, such as a stadium from Arup as in the previous presentation. You could say the area of output, housing, is far simpler; the complexity lies in being able to handle any number of possible housing formats.

The team use Elefront to communicate with fabricators across a wide range of CNC/CAM solutions, including very high-throughput facilities on Scottish docks, where timber lands from around the world for processing on industrial CAM equipment on a huge scale.

ArchiCAD models utilising the tailored Grasshopper link enable the models to be BIM compliant and also assist in the coordination of M&E services. This BIM compliance allows full support for Design for Manufacture and Assembly (DfMA) processes, leading to high levels of quality control.

Real-time costing is enabled via the use of OpenNest and Grasshopper, linked to live spreadsheets maintained by the fabricator which include personnel rates, general overheads, supply costs for materials and so on. Matt explained that they are currently seeking BOPAS accreditation for mortgage-approval status, so that high-street lenders can approve mortgages for MassBespoke-built housing projects.

Quite a range of third-party plug-ins for Grasshopper are utilised by the MassBespoke team, including those already mentioned plus Anemone, Speckle, Lunchbox and Bowerbird, among others.

 

The post Grasshopper UGM | Manchester | April 2019 appeared first on Rhino 3D.

]]>
Grasshopper and AR/VR for Rhino UGM – February 2019 https://rhino3d.co.uk/events/grasshopper-and-ar-vr-for-rhino-ugm-february-2019/ Thu, 13 Dec 2018 12:33:06 +0000 https://www.rhino3d.co.uk/?p=1042 With Grimshaw, Fologram, Foster + Partners and Chaos Group For our first User Group Meeting of 2019 we are combining our usual Grasshopper UGM with our VR/AR user group. The […]

The post Grasshopper and AR/VR for Rhino UGM – February 2019 appeared first on Rhino 3D.

]]>
With Grimshaw, Fologram, Foster + Partners and Chaos Group

For our first User Group Meeting of 2019 we are combining our usual Grasshopper UGM with our VR/AR user group.

The group is for those who are interested in meeting in order to network, discuss and explore Grasshopper3d and virtual and augmented reality solutions for Rhino3d.

The meetings follow a simple format of at least one presentation from a customer with experience in this field, followed by group discussion and informal pleasantries.

Confirmed Presenters are Grimshaw, Fologram, Foster + Partners and Chaos Group.

Details: This meeting took place on Thursday 21st February 2019 at Grimshaw, 57 Clerkenwell Road, London, EC1M 5NG

Special thanks to Andy Watts and the team at Grimshaw for hosting this latest UGM.

Meeting notes are available at the bottom of this page.



Preceding this UGM there was a 3-day workshop with Fologram on 19th, 20th and 21st February 2019 at Grimshaw – find out all the details about this workshop here on the Simply Rhino site.


Presentation by Grimshaw

Within the Design Technology team at Grimshaw, VR and computational design are key components of the work we undertake to support our design teams and research new ways of working.

Grimshaw VR Demonstration Shot

The use of VR has grown to become a well-established part of the design toolset at Grimshaw, from internal reviews and design checking through to client presentations and stakeholder engagement. More recently, AR has shown the potential to introduce a new facet to this, overlaying our design information on a more readily understandable physical context, be it at full scale or otherwise.

From projects such as Waterloo International through to the Dubai 2020 Expo opening next year, the work of Grimshaw has always had a strong relationship with computational design. Today, tools such as Rhino and Grasshopper are integral to our everyday work.

Grimshaw VR Demo on Tablet

Recently, our in-house Design Technology team has been looking at ways of merging these two key work-streams. Whether through bespoke in-progress workflows or through the use of more developed tools such as Fologram, we are actively seeking ways to enable our teams to harness the power of computational design tools such as Grasshopper in an immersive 3D environment.


Presentation by Fologram

Fologram is a toolkit that allows designers to quickly build interactive mixed reality (MR) applications within Rhino and Grasshopper. By providing users with access to device sensor data (spatial meshes, gesture events and computer vision tools running on camera feeds) as inputs to parametric models, the full ecosystem of Grasshopper plugins (physics simulations, structural and environmental analysis, machine learning, etc.) can be extended to run in mixed reality. Gwyllim Jahn and Nick van den Berg will demonstrate applications developed with Fologram by partners and clients that augment existing processes of design, modelling, analysis and making.

Using Fologram with mobile phone to design in Mixed Realities

Designing and making within mixed reality environments extends the skills and capabilities of designers and builders by improving spatial understanding of design intent and reducing the risk of human error associated with extrapolating 2D instructions to 3D form. These new capabilities dramatically improve the ability of conventional craftsmen and construction teams to fabricate structures with significant variability in parts, form, structure, texture, pattern and so on, and in many cases completely reverse design viability as impossibly expensive and difficult proposals become straightforward, low risk and cheap. Complex designs can now be fabricated on standard building sites, with cheap materials and tools, and without expensive expertise or design documentation.

Using Fologram to design in Mixed Reality

We will discuss work from Fologram that investigates the implications of MR assembly methodologies on architectural design through the lens of several architectural prototypes. Could making in mixed reality allow us to refigure CAD-CAM not as a means of working to high degrees of tolerance and precision, but instead as a return to craftsmanship, intuition and reflexive making? How will the medium of MR enable new forms of collaboration between designers and manufacturers, or between humans and machines? What new architectural forms might be found in this superposition of the digital and the craftsman?

Working with Fologram in Mixed Reality

At the end of the presentation there will be the opportunity to have a brief demonstration of the Fologram toolkit on the HoloLens and mobile phones, and discuss applications within research, teaching and practice.

Check out Fologram’s vimeo channel to see Fologram at work.

 


Presentation by Jonathan Rabagliati from Foster + Partners

The Bloomberg Ramp | Rising through the centre of the building, the distinctive hypotrochoid stepped ramp animates the whole Bloomberg office space. Fabricated as a steel monocoque, the ramp is clad in bronze panels. Its form is based on a mathematical curve called a hypotrochoid, which forms a smooth, continuous three-dimensional loop rising up to the skylight. Each loop cuts through a near-elliptical opening in the floor plate, and these elements, rotating through 120 degrees on each level, create dramatic views that open deep into the building.
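For readers curious about the underlying curve: a hypotrochoid is the path traced by a point attached to a circle rolling inside a larger fixed circle. A minimal Python sketch of the standard parametric equations is below – the radii and offset are illustrative values, not the ramp’s actual dimensions:

```python
import math

def hypotrochoid(R, r, d, n=360):
    """Sample n points on a hypotrochoid: the path traced by a point at
    distance d from the centre of a circle of radius r rolling inside a
    fixed circle of radius R."""
    pts = []
    for i in range(n):
        t = 2 * math.pi * i / n
        x = (R - r) * math.cos(t) + d * math.cos((R - r) / r * t)
        y = (R - r) * math.sin(t) - d * math.sin((R - r) / r * t)
        pts.append((x, y))
    return pts

# At t = 0 the traced point sits at x = (R - r) + d, y = 0.
pts = hypotrochoid(5.0, 3.0, 2.0)
print(pts[0])  # -> (4.0, 0.0)
```

In a design tool the 2D curve would then be swept along a rising helix to produce the three-dimensional loop the text describes.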

The ramp is central to the way Bloomberg chooses to operate, embodying a sense of movement and dynamism through its form and function. The ramp is conceived as a place of meeting and connection, between people and parts of the office. As the primary connection between the floors, it acts as a great social condenser for the building, bringing both life and light to the building.

The presentation by Jonathan Rabagliati charts the story of design through fabrication, using computational design, VR, laser scanning and metrology, and close collaboration with structural engineers and contractors to realise a remarkable design.

 


Presentation by Chaos Group | V-Ray Next: Immersing in Parametric Design

With 15 years at the forefront of expanding the possibilities of visualisation, Chaos Group has made groundbreaking performance the readily expected feature of every subsequent release, and has moved on to pushing the limits of what it is generally possible to visualise. There is no better testament to that than V-Ray’s latest Next line – even more so with the expected V-Ray Next for Rhino.

On February 21, CG Specialist Lyudmil Vanev comes on stage with something far more powerful than a new-version presentation, however exclusive it may be. Lyudmil will show our whole take on the way designers can see and experience their designs, adding substance to our concept of visualisation as a design tool with an integral role in every stage of the design process.

You will get a detailed, exclusive preview of how V-Ray Next for Rhino completes an approach we started in V-Ray 3 – direct access to rendering from within Grasshopper, without the need to exit, bake, etc. V-Ray’s entry directly into the parametric toolset provides interactivity and a new depth of immersion and understanding of changes, parameter impact and design evolution. It simply gives the designer a new set of eyes – to see everything right where it happens, straight within the parametric script, changing in real time and, when needed, with the most realistic materials. This is also the main reason for, and entry point into, interactive immersive virtualised design with V-Ray Next for Rhino.

Operating straight within Grasshopper, V-Ray brings its complete list of features and powers, up to rendering animations and supporting VRscans. Furthermore, it brings two major opportunities: GPU rendering for speed and computing power, and bridging to V-Ray for Unreal to provide a seamless transition from the parametric plugin into interactive virtual setups.

So – interactive, fast, realistic, gamified parametric design. First-hand, for the first time, and with a hint at the next areas of research and development straight from the team.


 

 


Grasshopper3d and AR/VR for Rhino3d Meeting Notes | Grimshaw | February 2019

Georgios Tsakiridis | Grimshaw | How does VR make a difference in the design process?

VR and AR sit within the research cluster of Grimshaw's Design Technology department, which has champions for areas of interest and research. They try to be early adopters: they held an AR exhibition about four years ago and now use 360° VR scenes as a standard project deliverable. They also have a VR cave available when a project can justify it.

VR as a new way of working: In general, a small group within the practice will explore new technologies first and then roll them out across the practice. But VR can be a relatively simple technology with which everyone sees results quickly, so does it change what and how people design?

They set up rooms for Vive rigs and gave small headsets like the Samsung Gear to design teams. The use of these simple headsets means that the design teams use them in the design process itself, and not just in client reviews. They started with simple workflows, using Enscape and IrisVR, which offer quick and reliable output and are accessible to the team.

For stakeholders, VR gives unmatched clarity, without the distortions inherent in CGIs.

MIPIM was a key first showcase, where they demonstrated a model of their Dubai Sustainability Pavilion, but the ‘Heathrow Horizon Community’ engagement was perhaps more important because they created a set of 360s of key passenger journey points. The ‘Horizon’ are a group of frequent flyers who were shown VR scenes of a generic airport pier environment, and were asked to assess their perceptions of the width, amenities, comfort and so forth. VR allowed swift engagement in the complexity of systems of an airport, with members of the group even being able to start to plot out airport layouts themselves.

These early experiments led to the development of a wishlist for VR in the practice. These included: better design tools, integration with Rhino and Grasshopper, easy-to-customise interfaces, scene interactivity, live linking between applications, and some augmented reality.

Mixed Reality with AR

The journey is now towards the mixed-reality world of AR. To explore this, Grimshaw hired a specialist games designer and started working with Fologram, thanks in part to its easy workflow from Rhino. They saw its immediate potential, so implemented AR in quick review sessions, e.g. dynamically adjusting a stadium roof, a technology far quicker than the equivalent 3D-printing process for design review. They have also tried AR at the masterplan scale, being able to see the impact of adjusting the volumes of buildings in relation to one another. AR here has a distinct advantage over VR in that the 'sunglasses' style of headset means that the user can stay in the conversation taking place around the model.

Grimshaw have also been developing custom apps and engaging with the video game platforms Unity and Unreal. However, this is costly in time and requires coders on the team. But it does provide a degree of photorealism and the animation of elements such as the doors on an underground train. The user finds themselves in a much more immersive place than before.

Grimshaw feel that they are still in the 'humble beginnings' of working with AR as an in-house technology. They are still exploring the tools and workflow, but it's a priority for investment. The 'holy grail' is interoperability: can you connect Rhino and Grasshopper with video game engines? Well, that has been happening for a while, but what has got the team excited recently is the ability to run 'Rhino Inside', with the software being called from within other platforms.

Go-Rhino-Go

‘Go-Rhino-Go’ is an open-source GitHub project that was developed at a hackathon in New York in conjunction with architects from Foster & Partners and others. It allows you to call Rhino and the relevant libraries to permit the real-time building of geometry in Rhino from the Unity interface, combining these two worlds in a collaborative situation. It will never replace Rhino, but it is an in-between sketching tool with really big potential which they want to explore further. There are certain limitations due to how Rhino is developed, but they are in discussion with McNeel, and since Go-Rhino-Go is open source they're keen to see a community grow.

The advantage of game engines is that they have a certain power to narrate and communicate complex messages within a simple frame of constraints, so it becomes less about where you do the calculations and more about what systems like Unity can give us. As a result, Grimshaw have just welcomed a game developer to their team, a new breed in the world of AEC.


 

Lyudmil Vanev | Chaos Group | V-Ray Next for Rhino

Firstly, it will be smarter, so smart that it takes optimisation decisions for you. A new asset editor allows common libraries, which are stored where you want, not in V-Ray. It features a spline curve editor for value manipulation (e.g. hue, saturation), and metallic PBR-style shaders have been added. There's a light editor, where you can set up lights without making test renders in the scene, and a lighting heat-map analysis tool, as well as new multi-matte elements for compositing.

V-Ray Next also has two new patented algorithms governing scene intelligence.

There’s an adaptive dome light that can use image based lighting. There’s no need for light portals any more, just use the dome. V-Ray Next now has auto exposure and white balance for scenes, so V-Ray can create perfect lighting for you, and it will also handle the difference between interior and exterior lighting.

Next has cut render times by 2 to 5 times, even up to 11 times in some cases. Next is generally 20-50% faster for exterior scenes, and with GPU processing up to 18 times faster (again in 3DS Max). The general message is that you can achieve more with less.

Denoising was good in version 3.6, but had only one algorithm. It was perfect for cleaning up the end result of a visualisation process, but what about a faster workflow? So they have added a new denoiser using Nvidia AI, trained on thousands of denoising patterns.

VRScan GPU

Chaos Group's material-scanning technology has been in development for 10 years. You can put any material sample inside, and VRScan captures its response in every single direction as mathematical data. Clients used to complain that hand-programmed materials didn't look like their material samples; you'd spend weeks tweaking them and they still wouldn't be happy. With the scanner, they look real.

V-Ray for Grasshopper

V-Ray allows you to render Grasshopper definitions without baking geometry. This means you can create an animation in Grasshopper and render it directly, via a V-Ray Scene node in Grasshopper. You can also create materials in Rhino and manipulate them in Grasshopper. Grasshopper can also control the lighting, camera and sun, again creating dynamic scenes, all without baking.

Overall, these new items are about a tenth of what’s coming…

V-Ray’s VR and AR Pipeline

Using the vrscene transfer format is a great solution for taking work into Unreal. It does have one limitation: everything must be a texture, as Unreal doesn't accept procedural definitions. There's also still the need to export, as there's no live connection yet. But the V-Ray scene file does contain all the geometry, lighting and so on, and V-Ray for Unreal converts shaders, lighting etc. into native Unreal definitions. In the Unreal settings, you can directly select V-Ray denoisers and other features, and V-Ray will bake all of the lighting within Unreal; you can even manipulate the bakes in Photoshop as they are not hidden away.

Project Lavinia

This is a new real-time ray-tracing viewer based on Nvidia DXR technology. It's a drag-and-drop viewer for V-Ray scenes created on any V-Ray platform. It can handle scenes with billions of polygons without prebaking or faked reflections. Where is it going? Will it be useful? Feedback to Chaos Group, please! The alpha for 3DS Max is out already, and Rhino is coming.


 

Long Nguyen | Research Associate at the University of Stuttgart | C# Scripting and Plug-in development for Rhino

Long teaches classes which assume no prior knowledge of C#; during the course of the workshops the students learn the language and get to develop their own plugins. He also shows algorithms for computational design, to achieve logic not possible with the visual components of Grasshopper alone, and teaches good, clean programming practices, enabling the creation of plugins that can be packaged and distributed commercially. Example use cases might include getting elements to obey rules (e.g. don't self-intersect) or studying liquid erosion of a terrain. The next introductory classes with Simply Rhino will be in June.

Advanced classes coming soon.

In September, Long will also offer advanced versions of the workshop, covering, for example, parallel computation in C# for proximity checking, or how to make a Grasshopper plugin undertake heavy calculations in the background without freezing the main user interface.
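To give a rough flavour of what 'parallel proximity checking' means, here is a hypothetical sketch. Long's workshops use C# and the Rhino SDK; this illustration is in Python, and the point data and function names are invented. The idea is simply to split a brute-force pairwise distance check across several workers:

```python
# Hypothetical sketch: brute-force proximity checking, parallelised by
# splitting the outer loop across worker threads. A real Grasshopper
# plugin would do the equivalent in C# (e.g. with Parallel.For).
from concurrent.futures import ThreadPoolExecutor
import math

def close_pairs_for_chunk(points, indices, threshold):
    """Pairs (i, j), j > i, with i in `indices`, lying within `threshold`."""
    pairs = []
    for i in indices:
        for j in range(i + 1, len(points)):
            if math.dist(points[i], points[j]) <= threshold:
                pairs.append((i, j))
    return pairs

def close_pairs_parallel(points, threshold, workers=4):
    # Interleave indices so each worker gets a similar amount of work.
    chunks = [range(k, len(points), workers) for k in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = pool.map(
            lambda c: close_pairs_for_chunk(points, c, threshold), chunks)
    return sorted(p for r in results for p in r)

points = [(0, 0), (0.5, 0), (10, 10), (10.2, 10)]
print(close_pairs_parallel(points, threshold=1.0))  # [(0, 1), (2, 3)]
```

Keeping the heavy loop in worker threads (or, in C#, background tasks) is also what keeps the main user interface responsive, which is the second topic the advanced workshop mentions.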


 

Jonathan Rabagliati | Foster and Partners | The Bloomberg Ramp

The project for a grand ramp in the Bloomberg building in the City of London started seven or eight years ago, but has its roots in work done by the practice 20 years ago at the Reichstag in Berlin, and later at the GLA Building in London. There they learned some of the tricks of creating a minimal and smooth appearance while satisfying code requirements for level sections within the slope.

At Bloomberg, the aim was to build a building with a huge internal area which nonetheless respects the medieval street pattern. In the heart of the north zone of the plan there is a huge triangular space and atrium, with a ramp that rotates as it passes each floor. It's not just a conduit for people; it's also part of the ventilation strategy.

One of the challenges was how to get the client's head round what they were designing. They did 3D prints and presentation models, and they did lots of renders. But the development process was necessarily complex: having designed the model parametrically in Grasshopper, every 'frame' in the animation of the ramp was its own Rhino surface model, so there could have been an infinite number of different ramps.

Jonathan is passionate about curvature, and using ellipses as the basis for the form disturbed him. The inherent tightening and loosening of the curvature was no good; he wanted a more elegant solution. So he plotted the acceleration and deceleration of the curvature, and it revealed unwanted kinks. He turned instead to an equation that is in fact just like a Spirograph: rolling one circle around another. The ratio between the gears on the moving wheel and the circular frame creates a trochoid, and in turn you end up with the setting-out of the ramp, with the skylight above being defined in a similar way.
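For readers curious about the maths, the rolling-circle construction has a standard parametrisation, the hypotrochoid: a circle of radius r rolls inside a fixed circle of radius R, and a pen at distance d from the rolling centre traces the curve. A minimal sketch, with illustrative values rather than the project's actual gear ratios:

```python
import math

def hypotrochoid(R, r, d, t):
    """Point traced at parameter t by a pen at distance d from the centre
    of a circle of radius r rolling inside a fixed circle of radius R."""
    x = (R - r) * math.cos(t) + d * math.cos((R - r) / r * t)
    y = (R - r) * math.sin(t) - d * math.sin((R - r) / r * t)
    return x, y

# The gear ratio R/r controls how many lobes the curve makes before it
# closes; sampling t densely gives a smooth setting-out curve in plan.
curve = [hypotrochoid(R=5, r=3, d=2, t=2 * math.pi * k / 360)
         for k in range(361)]
print(curve[0])  # (4.0, 0.0): the pen starts at x = (R - r) + d
```

Because curvature varies continuously along a trochoid, a curvature plot of this family avoids the kinks that the ellipse-based layout showed.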

There was then a dialogue to and fro with Adams Kara Taylor's engineers to refine and simplify the geometry. The beauty was that you could pull out the structural model and plug it into his hypotrochoid model, which would then update the engineers' model at the same time. This process eliminated lags in coordination, but required common naming conventions and a shared language to make the collaboration work. Rather than wasting weeks on coordination, they could get on with building the Grasshopper model and doing detailed analyses of load cases for all 96 steps and the knock-on effects on all the other steps. It created a matrix of data that could be interrogated, and the efficiency freed up engineering resource to undertake a far more in-depth study than is usually done. Overall, it reduced the uncertainty factors through greater clarity.

Full scale prototyping was very important for user comfort, and also to convince the district surveyor that the proposed gaps around a glass infill panel at the landings of the ramp would be safe.

In a combination of precision and brute force, they ended up using a contractor based in Japan for the bronze cladding panels, with the substructure created at Littlehampton Welding and the elements coming together after a series of overnight deliveries into the City of London. For the contractors, they made a simple set of instructions and diagrams listing the variables for each element, which became a 96-page method statement of how to build it.

During the design process, Michael Bloomberg visited a mock-up, and, being of lesser stature, he questioned the height of the balustrade and wanted it lowered, to which the senior partner at Fosters said 'yes!', not knowing the consequences. But through another equation, the team were able to find a solution. In January 2014, they made a pioneering use of the Oculus Rift, in one of the first projects to use it to test different design options, not just for client review. They had to find a way of smoothing the curves of the lowered balustrade while retaining the setting-out at the floor levels. To resolve this they had to introduce an s-curve to smooth the shapes, but it had to do so imperceptibly. So they tried various s-curves to maximise the effect and used VR to see which looked best.

The whole development process took months, with the Rhino geometry eventually transferring to a fabrication model produced in Catia within a tolerance of 0.004mm. Then came the amazing bit: yes, the fabricators had an accurate model, but did they build it right? And would the cladding fit? To check this, they did three one-billion-point laser scans of the installed substructure. They then put that dataset into Rhino, tested it against their design geometry, and colour-coded it for clash detection. The maximum deviation was 24mm over six or seven floors, meaning there were a few areas where they had to alter the geometry. But by this time the very beautiful and very expensive Japanese bronze cladding was landing at Tilbury Docks and couldn't be changed. So the adjustment process was to combine the scan and the solid model to create a virtual model where they could 'jiggle' the bronze panels, using the minimum shift distributed across them all, and set up rules for how the periodic 10mm shadow gaps could be tweaked accordingly.

Although they were using metrology with sub-millimetre precision, in the end the specification writing was key. The spec just called for 'a smooth continuous curve': just a few words as opposed to all that data! And you find yourself reverting to written definitions, such as "plus or minus 2mm", and end up arguing on site when the contractor points out that this actually allows for 4mm of misalignment, because it's plus 2mm on one panel and minus 2mm on the next. The moral here is that for all the computational sophistication, don't disregard the specs!

It’s all very well designing or making things with these tools, but the process of actually realising something like the Bloomberg Ramp is just as fundamental and crucial. And don’t lose sight of the fact that the end result is about simple human interactions: the ramp enables casual interactions and conversations to take place. And one final nice reward was that the plan of the ramp was adopted as a logo for the building.


 

Gwyllim Jahn | Fologram | Making in Mixed Reality

Fologram are building software for mixed reality devices, so designers can use them for design and making. They’re interested in how you go from design packages to making things in the real world, without 2D drawings.

AR technology was originally about aiding fabrication; it comes from work done by two Boeing engineers to enable accurate placement of systems within an airframe under construction. It can still be used for precise registration, but also for shared experiences and to build natural, intuitive interfaces.

Fologram work with the Microsoft Hololens, which offers precise tracking, but the downside is the need to develop in Unity. So they have made a bridge from Rhino and other platforms.

Their target is to reduce the time and cost risk of experimental architecture. A case in point is Frank Gehry's Dr Chau Chak Wing Building at the University of Technology Sydney. There, the undulating curved brickwork façade had to be installed by an expert team with painstaking precision, meaning that a bricklayer used to laying 400+ bricks per day was down to 80 bricks a day.

There was a clear need for a way to make the process simpler and faster, avoiding the need Gehry's office had to provide setting-out information for every single brick; to find a way to use less skilled workers, not just master bricklayers; and for them to be able to work in parallel. So Fologram did a small test build of a sinuous brick façade using local 'brickies', who were able to build in one day what would otherwise have taken weeks, because each of the crew could see a projected hologram of exactly where each brick should go. The brickies themselves were super-excited: using less skilled labour alongside masters leads to better fees, a faster installation and a better result for the architects.

Fologram also work with art fabricators, who can use virtual templates to rapidly develop work as they go, without a steep learning curve.

It's a case of using old tools for new tricks. Can we rethink old design tools? Now you can stream a model to multiple devices, so you can have collaborative modelling without CAD skills. With Fologram you can have three people working on one Rhino document simultaneously, just using three iPhones.

A classic test is the three-dimensional Voronoi diagram: can these be made more quickly using these tools? Now you can combine the precision of digital modelling with the ability to overlay analogue tools, all without 3D printing. They overlaid a Voronoi hologram from Grasshopper onto the workshop of a Chinese fabricator, who just had to follow the hologram and bend the metal components until everything was just right. You can even then use Fologram to augment the physical object with AR-animated elements, like a breathing skin.
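For anyone unfamiliar with the underlying idea, a Voronoi partition simply assigns every point in space to the cell of its nearest seed; Grasshopper's Voronoi components compute the cell boundaries, which is the same partition seen from the other side. A hypothetical sketch of the concept (the seed coordinates are invented, and this is not Grasshopper's implementation):

```python
# Sketch of the Voronoi idea: a point belongs to the cell of whichever
# seed is closest to it. Works in 3D just as in 2D.
import math

def voronoi_cell(point, seeds):
    """Index of the seed whose Voronoi cell contains `point`."""
    return min(range(len(seeds)), key=lambda i: math.dist(point, seeds[i]))

seeds = [(0, 0, 0), (10, 0, 0), (0, 10, 0)]
print(voronoi_cell((1, 1, 1), seeds))  # 0: nearest the origin seed
print(voronoi_cell((9, 1, 0), seeds))  # 1
```

The cell walls the fabricator bends into metal are exactly the surfaces where two seeds are equidistant.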

And the system is very lightweight: even using just a laptop, you can combine a live 3D scan of a space with Rhino models and interact with it using an iPhone. All of which can be done anywhere in the world with just a WiFi LAN and a phone hotspot.

There’s a free mobile Fologram app available from their website and they are about to debut exciting new developments following the launch of the HoloLens 2.


 

Next Meeting! Our next Grasshopper User Group Meeting takes place in Manchester on Thursday April 4th 2019 at Arup. See here for all the details on the presenters and how to book your place.

The post Grasshopper and AR/VR for Rhino UGM – February 2019 appeared first on Rhino 3D.

]]>
Formlabs Form 2 User Group Meeting – April 17th, London https://rhino3d.co.uk/events/formlabs-form-2-user-group-meeting-april-london/ Thu, 22 Mar 2018 17:40:38 +0000 https://www.rhino3d.co.uk/?p=922 Simply Rhino invite you to join us at Digits2Widgets for the first Formlabs Form 2 User Group Meeting. When: Tuesday, April 17th, 2018 Where: Digits2Widgets, 61-63 Rochester Place, London, NW1 […]

The post Formlabs Form 2 User Group Meeting – April 17th, London appeared first on Rhino 3D.

]]>
Simply Rhino invite you to join us at Digits2Widgets for the first Formlabs Form 2 User Group Meeting.


Form 2 3D Printer working within its eco-system

The group is for those who are interested in meeting in order to network, discuss and explore opportunities the Form 2 SLA 3D printer brings.

This informal user group meeting is suitable for those completely new to the Form 2 3D printer and those already familiar and/or working with the Form 2.

The meeting will consist of at least one short presentation from a Form 2 user, followed by group discussion and informal pleasantries.

Of course, there will also be the chance to see the Form 2 eco-system at work.


 

If you’d like to join us for the meeting then you’ll need to register for your place here.



 

Presentations will be from:

Jimmy Littmann – Formlabs / Jimmy is the Formlabs Country Manager for UK & Ireland. Jimmy will present the latest Form 2 news from Formlabs.

Bart Radecki – Digits2Widgets / Bart is the CAD Production Manager at Digits2Widgets. Bart's presentation will be of interest to those new to 3D printing and those already utilising the technology within their workflow. The presentation will look at a range of projects that utilised the Form 2, from fast-paced medical projects and beautifully finished prototypes to precision jewellery.

 


 

The post Formlabs Form 2 User Group Meeting – April 17th, London appeared first on Rhino 3D.

]]>
AR/VR for Rhino UGM – February 2018 https://rhino3d.co.uk/ar-vr/ar-vr-rhino-ugm-february-2018/ Mon, 29 Jan 2018 16:23:17 +0000 http://rhino3d.wpengine.com/?p=789 This is the second meeting of our AR/VR for Rhino User Group. The group is for those who are interested in meeting in order to network, discuss and explore virtual […]

The post AR/VR for Rhino UGM – February 2018 appeared first on Rhino 3D.

]]>
This is the second meeting of our AR/VR for Rhino User Group.

The group is for those who are interested in meeting in order to network, discuss and explore virtual and augmented reality solutions for Rhino.

The meetings follow a simple format of at least one presentation from a customer with experience in this field, followed by group discussion and informal pleasantries.

Details

  • Thursday 22nd February 2018
  • 18:30 – 20:30
  • Heatherwick Studio, 356-364 Gray’s Inn Road, London, WC1X 8BH (See map here)

If you’d like to join us for the meeting then you’ll need to register here.


 

Presenting at the meeting:

Heatherwick Studio Logo

Heatherwick Studio started using VR and interactive models several years ago with the first release of the Oculus Rift developers' kit. Since then we have developed several workflows for using VR not only for client presentations, but directly in the design process.

Heatherwick Studio VR Image

Designers are able to explore their designs, from simple static VR images rendered directly in Rhino to complex, fully interactive models imported into Unreal Engine.


We have also started exploring new ways of 3D sketching using Tilt Brush and scanning physical models with CT scans and exploring them in VR.


This presentation will demonstrate several workflows we have developed to review our Rhino models in a VR environment on selected projects.


Images courtesy of Heatherwick Studio.


 

Also joining us are:


Developers of the MΛSSLΞSS Pen, a precise and intuitive VR tool for great 3D designs.

Jonn from MΛSSLΞSS will give a short presentation of the MΛSSLΞSS Pen for XR design and will also demo the current system.

MΛSSLΞSS Pen for XR

 

Integration with Vive, Oculus, Unity, and Unreal

 

Use in 2D like a mouse or graphics tablet

 

Contact Jonn /  jonn@massless.io


Thanks to Heatherwick for kindly hosting this AR/VR for Rhino UGM.

 

The post AR/VR for Rhino UGM – February 2018 appeared first on Rhino 3D.

]]>