AU Class

Crossing the Bridge: Mott MacDonald’s Digital Transformation


Description

Do you want to learn how to deliver bridge projects using an intelligent, model-based approach? Then this class is for you! The transition from 2D design to 3D parametric, model-based delivery for bridges remains a challenge for many consultants, where moving away from traditional approaches is considered either too difficult or unlikely to realize the benefits of building information modeling (BIM) delivery. This class will show the key steps our team has taken over the last five years using Autodesk software to model, design, analyze, and document various types of bridges and viaducts. These steps demonstrate our evolution toward intelligent, model-based delivery. We'll share the many challenges, pitfalls, and lessons learned gained along our journey, working in various countries and stages of delivery. Attendees will gain the guidance needed to prepare their team, project disciplines, and clients to pilot new approaches for bridge design and documentation, and beyond to full BIM-based delivery.

Key Learnings

  • Discover common challenges with bridge modeling workflows and learn how to overcome them.
  • Learn how to implement efficient workflows using Revit, Dynamo, and Civil 3D to model and document linear infrastructure.
  • Learn about new workflow developments in bridge design and analysis.
  • Discover how to start your path toward an intelligent, model-based approach for bridges.

Speakers

Transcript

IGOR VARAGILAL: Hi. Hello, everyone. Today, I, Igor Varagilal, together with my colleague, Paul Briedis, will be presenting on Crossing the Bridge: Mott MacDonald's Digital Transformation. As I said, my name is Igor Varagilal.

I've been working with Mott MacDonald for around four years now. My background is as a structural engineer. But I've been working with bridges and civil structures modeling and automation. And I'm based in Prague in the Czech Republic.

PAUL BRIEDIS: Hi, everybody. My name is Paul Briedis. I've been working with Mott MacDonald for about 18 years. I started out as a highway engineer, and more recently I've been working as a digital lead, supporting our transportation projects across the globe. And I'm also based in Prague in the Czech Republic.

IGOR VARAGILAL: Why do we think you should stay in the room? What's in it for you? If you are a client or an asset owner, we believe that we have developed our capabilities using the software that we are going to show, and we have developed a couple of good workflows as well. And this is all to meet the client requirements.

And a big advantage as well is that this improves our ability to respond to change. That could be change driven by the highway or railway geometry, or by engineering changes. The importance of this is that we can deliver value by meeting our client's information requirements.

If you are a product user, if you are the delivery team, we believe that there are different solutions for different problems and that no two projects are the same. And we are going to show you our digital transformation from 2D to a parametric and information-rich model delivery.

We want to show you our mistakes, and we hope that by the end of the presentation, you will be able to learn from them. During our presentation, the bridge engineering details will not be covered. What we are also going to show you is a sample of the bridge projects that we have delivered at Mott MacDonald, a sample of the projects that we worked on within large multi-discipline projects.

So in the last five years, we've been working on tender and detailed design projects. And we wanted to move from a 2D AutoCAD approach to information-rich 3D parametric delivery. To do that, we found that this process takes time and effort to achieve. It also calls for a range of solutions and software, a collection of tools. And we started that process using Revit, Dynamo, Civil 3D, and Navisworks.

Also, as our information management platforms, we used ProjectWise and BIM 360, but those were mainly dependent on the client and their requirements. We also worked a bit with Inventor and InfraWorks and more recently, we've been working with Rhino and Grasshopper and exploring their power to export this geometry to Revit.

The idea here is that we wanted-- or one of the main reasons or the outcomes of this process is that our team, the capabilities of our team, was steadily growing. And we wanted to do this transformation. We wanted to go on this journey because we wanted to embed the digital delivery in the way that we work.

We wanted to be the ones driving innovation and improving the efficiency of our workflows. And one of the more basic reasons: we wanted to do better, we wanted to deliver more by doing less. As I said before, we hope that by the end of this presentation, this will help you and your team forecast and plan the improvement of your project delivery.

PAUL BRIEDIS: Let's have a look at the journey we've taken over the last five years. For today's presentation, we've selected four projects to go through, the first of which is a collection of rail projects in Australia. These were detailed design projects. The second was a major highways project in the UK, again a detailed design project.

The third one, which is a tender project, again back in Australia. This was a milestone project for us, with the program constraints and the volume of work we had to generate and complete in that time. So that was a huge learning curve for us.

And the last one we're going to finish off is a project which we're currently working on. Again, it's a major highways project, detail design in the UK. And this is where we are adopting a quite advanced workflow in terms of the bridge delivery, connecting the analytical and the graphical models to further push the efficiency in our ability to deliver these projects.

So we're going to look at the key challenges both then and now, from five years ago up until today. A lot of these challenges will be familiar to those listening in. And we've broken them down into three categories, the first of which is delivery. So what are our challenges? Our client requirements. The project requirements.

We all know this is the main driver and one of the biggest challenges that we often face in delivering our projects. And this is what pushes us, a lot of the time, in the way that we want to go. And we're always trying to stay one step ahead, to have those capabilities to meet those requirements if and when they come around.

Responding to changes, as Igor mentioned. For anyone who's worked on a bridge project, being able to respond to a change in geometry from a highway or rail alignment is a fundamental aspect of the way we deliver projects. As we know, the geometry of a bridge could change twice in one project, or it can change 10 or 15 times, some of them large, some of them small. And that's a really fundamental part of how we plan the more technical model production and model delivery side of our work.

The quality and the consistency. Like any other engineering firm or consultancy delivering work, we want consistency and quality across our outputs. And this is no different from what we want to achieve as well.

Single source of truth. We all know about this as well. And again, we're constantly trying to have our model production and now we're touching on this latest project, the analytical, as a single source of truth. So that's a big challenge for us to try and achieve.

Let's go into another, the second category. It's the technology. So the capabilities of our team and those who are working on our projects. This is one of the big factors. Every time we work on a project, we don't always have the A-team. We don't always have those expert, elite users of Revit and Civil 3D and whatever it may be to work on these projects. We have a combination.

Some people have 10 years' experience and can do a lot of amazing things. And they'll be partnered with a couple of graduates who are just learning and eager to get involved. And that's something that always becomes a factor for us. And it sort of reflects the real-world challenges that we all face.

The limitations of this technology. Well, this is why we're here at AU. We're learning about where last year's limitations are no longer, and this is what we're looking forward to next year. So what the software was capable of doing five years ago is different from where it is now. And it'll be different again in five years' time, of course.

So these are the challenges that we have to work with. The information management. Your BIM 360s, your ProjectWises, your [INAUDIBLE], whatever it may be, these are the challenges that we have to work with. And we're talking about on these project examples large-scale multi-discipline projects. So managing big volumes of data, big teams across continents and time zones.

This is a major challenge for us. And I suppose just to highlight the point here, getting the right information to the right person at the right time is fundamental to everything we do. And it remains a significant challenge for those members of the team who are trying to ensure that that is indeed taking place. So that's a key challenge for us obviously.

Lastly, winning hearts and minds. So we started off our journey five years ago from a 2D delivery, and we all started there at some point; some people may still be there. We wanted to transition to a more intelligent way of delivering projects, with a team of people who have the technological skills to use the software and have, let's say, some forward thinking and a bit more of a vision.

There are always those parts of the team that are not involved in this, maybe part of the pure engineering or some other aspect of the delivery. And we can't progress in a more advanced or efficient way, or we're limited in how fast we can progress, if we don't have everyone convinced and believing that the new ways of working, the doing-more-for-less approach that we're all trying to strive for, are worth it. We need everyone on board.

It's not just a pretty looking 3D model. How many times, when you started working in the BIM or digital world, was that said to you when you'd created something and wanted to share it with an engineer? That's some of the things that we're challenged with.

And this is an evolution, not a revolution. This takes time. So what we're going through here, as I've said, is five years over four projects that we've selected. This is a gradual step change as we go, learning from our mistakes. This doesn't happen overnight.

As I said before, your challenges could be different, but a lot of these will be common to you as well. So hopefully that sets the scene. We're going to get on to the projects, and we can start to have a common way of evaluating how these different aspects listed on this slide are touched on and how we've tried to overcome these challenges.

So we'll get on to the first project: a collection of rail projects in Australia. This was a set of detailed design projects for a major railway upgrade. Within our scope was the design and documentation of two road and three rail bridges.

The client information requirements in this case: we had to hand over a native Revit model, and the level of development was LOD 300 with some minor attribute information. So a fairly standard kind of delivery, let's say.

This was our first bridge project for our team delivered in Revit. We of course, had done Revit projects prior to that in other practices and other different assets we were working on. But this was our first one done in Revit.

So the software versions we were working with: we want to touch on this just to get people thinking about the limitations or the challenges we were working with. This was Revit 2017. We had a bit of Dynamo with this. And the geometry of our road and rail was pulled together using Bentley Rail Track and Bentley InRoads. Igor.

IGOR VARAGILAL: Thank you, Paul. What did we do on this project? As I said also before, we started by using the technologies we had available, and this was Revit and Dynamo. A bit of Dynamo to create our models, and we also used Navisworks to visualize them and to just check the information.

Here's an example of one of the road bridges. And what we were trying to do is to understand the time and effort that it would take us to create the model. And we also wanted to understand what value does that bring.

We put it in parentheses, that "convince" part, because this was our first project: we wanted to convince ourselves that there was value in putting this time and this effort into building this Revit model.

Then, we also worked with the drawing production. In here, we can see an example of a traditional 2D approach of a drawing in AutoCAD versus a Revit output that we had from our models. We see that it's quite different. But this is exactly what we wanted to do.

We wanted to understand the time and the effort that we needed to put in to create the same output, to create a drawing that would be accepted by the industry and its standards.

And we wanted to understand what value we would get out of putting in that time and effort as well. So again, we have the "convince" in parentheses. We wanted to convince ourselves and the engineering team, the team that was working with us on the project, that there was value in using this workflow in Revit.

PAUL BRIEDIS: So just to touch on the challenges here. After each project, we'll go through the challenges and the lessons learned. So, limited resources available: first project, only a handful of people in our team had the skills. That was our first challenge.

The drawing production. So yes, that was a challenge. But that was exactly what we were doing: investing time and effort in parallel with the traditional 2D delivery of those drawings, to do this again and see how far we could get using that Revit model.

Again, the value from the model was a challenge. When we produced this model, it wasn't really taken on board as being a key deliverable. How do we then get the most out of it? It was more just "that's just the output", a bit of "that's a nice 3D-looking model, yeah".

The engagement with the engineering team. It's not that there wasn't engagement, per se, but the value that the production of this model gave the engineering team was somewhat limited. So that was the challenge we were trying to overcome: putting it front of mind and trying to make this a normal way of working was a challenge for us.

The lessons learned. I'll continue on this. The parallel delivery alongside the traditional 2D delivery of the drawing production and the design was a really good approach for us, because we didn't want to risk going all-in on creating the drawings from the Revit model in a single-source-of-truth approach. We had that fallback position: the traditional delivery was still going to be there.

Early planning and engagement. So what are we going to do next time? Well, we need to engage a lot more with the team, with early planning. What type of geometry are you going to get from the rail and the road, so from BRT and InRoads? What are we going to get from them? How do we use that information to generate our models?

Expanding our modeling capabilities. Of course, we saw what we had to do. We saw the effort, and that meant, right, we've got to go and invest time and effort into our capabilities. And I think everyone comes across this issue: better Dynamo organization. Whilst the Dynamo scripts did their work, they were pretty messy. So we've all come across this, I'm sure.

And the last one: understanding the time, effort, and value going into modeling. So that's our first project. That just sets the scene as to where we started. Now we're going to build up the complexity and the challenges from there on.

I'll introduce this project now, and then we'll go back over to Igor. Major highways project number 1. So a quick rundown of the project itself: detailed design, several structural types, LOD 400 and minor attribute information. So we're now getting a more complex, more detailed model. That means we had to up our game in terms of Revit, the use of Revit, and our capabilities.

We were working with a digitally mature client, and for a lot of people, I think that's actually where there's a struggle or a barrier, perhaps. When we're not pulled upwards by our client to progress, advance, and think about intelligent ways of working, it can sometimes be a bit inhibiting to our growth. And we were fortunate enough to have this digitally mature client to continually challenge and push us and try to get the most out of this investment.

Revit 2020, Dynamo, and the same with Civil 3D was what was being used. Igor-- oh, no, I'll just finish this. Here are just a couple of slides of the models we produced at the end of the day, some rendered images from Navisworks. And this just represents both the complexity and the form that we had.

So there's a steel footbridge in there and a couple of other steel girder bridges. All in all, that was the delivery over about one and a half to two years. So, Igor.

IGOR VARAGILAL: To achieve those beautiful renders that we just saw in Navisworks, we had to model it in Revit first. And we did it by breaking down our model. We could have modeled the bridge all in one go, but due to the client requirements, as Paul said in the beginning, we had LOD 400, and we needed to have some kind of specification for our elements and to break them down element by element.

So we did that by placing a couple of the elements that were mainly driven by the highways input, the strings that we would get from the highways input. We modeled those using Dynamo. A couple of examples of those elements are the deck, the barriers, and the girders.
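The core of this kind of Dynamo-driven placement is sampling the alignment strings at regular chainages and placing each deck or girder segment at the resulting setting-out point. The sketch below is a minimal, self-contained illustration in plain Python (Dynamo's scripting language), with a circular arc standing in for the alignment strings received from the highways team; the function names and the arc stand-in are ours, not part of any Revit or Dynamo API.

```python
import math

def arc_alignment(radius, start_angle=0.0):
    """Return a function mapping chainage -> (x, y, bearing) on a circular arc.
    An illustrative stand-in for the alignment strings from the highways team."""
    def point_at(chainage):
        theta = start_angle + chainage / radius   # swept angle at this chainage
        x = radius * math.sin(theta)
        y = radius * (1 - math.cos(theta))
        bearing = theta                            # tangent direction, radians
        return x, y, bearing
    return point_at

def setting_out_points(alignment, start_ch, end_ch, step):
    """Sample the alignment at regular chainages, as a placement script might
    before positioning deck or girder segments at each point."""
    n = int(round((end_ch - start_ch) / step))
    return [alignment(start_ch + i * step) for i in range(n + 1)]

# Sample a 500 m radius alignment every 25 m over the first 100 m
pts = setting_out_points(arc_alignment(radius=500.0), 0.0, 100.0, 25.0)
for x, y, b in pts:
    print(f"x={x:8.3f}  y={y:7.3f}  bearing={math.degrees(b):6.3f} deg")
```

When the highway geometry changes, only the alignment input changes; re-running the same sampling regenerates every setting-out point, which is the essence of the quick-response workflow described below.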

As for the piers, the piles, or the pile caps, those were modeled manually because we also believed that there was no need to have a Dynamo script to place these elements. They were changing, yes, but we could easily model them manually in Revit. And we would have the same value in doing so.

Special attention here also goes to the levels. This was a client request: to have each of those elements assigned to a specific level. You can see here there is an abutment level and a pier level, and the elements assigned to those levels are the piers and the abutments. That made it easier for the client and for us to see these elements in Navisworks.

This all started because of our key challenge. As we said in the beginning, this was the main challenge that we faced, and for sure a lot of you in the audience will face it as well: we want to respond as quickly as possible to the highway geometry changing. And we believe that we created a workflow that allows for this quick response to changes in the highway geometry.

To do so, not only did we need to have the workflow set up like this, but we also believed we had to keep the scripts we were using as small and as simple as possible, because if they are small, they are going to be easy to debug. This was also a project where we started to have more people, more resources, using Revit and Dynamo, so we wanted to keep the scripts as small and as easy to debug as possible.

And one thing that I didn't mention: we also had a Content Library for modeling these bridges, for the piles and the piers that you see here. But the big problem is that that Content Library was not accessible, and we wanted to have it as accessible as possible.

And we found that if we needed to click around or spend at least two minutes to get that content, it was a problem. So this was a lesson learned that we will touch on: we needed to have that Content Library more accessible to our users.

Very important as well: we broke down the model, but certain elements, such as the abutment here that was modeled in place, we didn't break down exactly as we should have.

And this is a very good example of understanding the why and the how of our modeling. It's a very good example of understanding the downstream use, in this case the constructability. Talking with our contractor, engaging with our contractor, we could understand that maybe we needed to model this abutment a bit differently, with the wing walls, the wing wall bases, and the wing wall foundations separate.

Another thing that we did in this project was the drawings. As Paul said in the beginning, this was our first detailed design fully delivered in Revit, so we actually had to produce the drawings.

As we all know, Revit is maybe not the best tool for linear infrastructure. But we still felt we were in a good position to try, with the skills that we had and the growth we were making with our team, to achieve a drawing output that would be accepted.

But it cannot be overstated how difficult and how time-consuming this process is. Yes, we have drawings here that look good, and that was our output, but it was very, very complicated to achieve this. And as you all know, there are a couple of challenges that you face when creating these drawings. If our bridge is curved, in plan or in elevation, it's a challenge to create a long section, to create an elevation of that bridge along that curved alignment.

If we annotate or create text notes for some of the elements and the bridge later moves position, or if there are some changes, we need to move those annotations, we need to adjust them. And if we want to annotate some of these elements that are not perpendicular to the view, it can also be very complicated.
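The long-section problem along a curved alignment is essentially an unfolding operation: instead of cutting a straight plane through the bridge, the deck levels are plotted against chainage measured along the curve. A minimal sketch of that idea, assuming a polyline approximation of the plan alignment (the sample arc and the 2% fall are illustrative values of ours, not project data):

```python
import math

def developed_section(plan_points, levels):
    """Unfold a curved plan alignment into (chainage, level) pairs so a long
    section can be drawn along the curve rather than on a straight cut plane.
    plan_points: (x, y) vertices along the alignment; levels: deck level at each."""
    chain = [0.0]
    for (x0, y0), (x1, y1) in zip(plan_points, plan_points[1:]):
        chain.append(chain[-1] + math.hypot(x1 - x0, y1 - y0))  # cumulative length
    return list(zip(chain, levels))

# Illustrative curved deck: part of a 100 m radius arc, falling at roughly 2%
pts = [(100 * math.sin(t), 100 * (1 - math.cos(t)))
       for t in [i * math.pi / 16 for i in range(5)]]
lv = [50.0 - 0.02 * i * (100 * math.pi / 16) for i in range(5)]
for ch, z in developed_section(pts, lv):
    print(f"chainage={ch:7.3f}  level={z:6.3f}")
```

The same unfolding is what a view along a curved alignment has to do implicitly, which is why curved long sections and skewed annotation are so much harder than their straight-bridge equivalents.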

So these are just a couple of the challenges that we faced creating these drawings. I'm sure that you all have different challenges when you are trying to do these drawings. But very importantly as well, during this process and after, we found out the importance of having a very consistent and robust Revit template.

This is our starting point for our Revit models but also for our drawing production, so we needed to have a Revit template. In this case, at the beginning of this project, we did have one. But the more we used it and the more we broke it down, the more we realized there were some things that we needed to improve. And we used this experience as a building block for the next project, because we wanted to achieve consistency in our drawings.

And we wanted to have a final product that is consistent across all of the bridges, all of the models that we are working on. And this can sometimes be difficult to achieve on a major project when we have multiple people working at the same time.

Just on the drawing production as well: we also developed a Dynamo script that highlighted the elements that were modeled in the 3D environment. These are the elements that you see here in red.

And we did this because we believed it gave us a lot of value when looking at these drawings, not only for us but also for the engineering team when they were looking at our drawings in Revit. A good example here: if you look at this typical Section 1, there are the bracing elements. They were not modeled in 3D, and you can see that they are not highlighted in red. And this is very obvious in this section.

But the more elements you include in your model, let's say your Revit model, the more detail you're going to have, and the easier the drawing production becomes. You don't have to create 2D details to insert into your drawings.

And very importantly, and this is a sentence that we sometimes say: the model doesn't lie. What we see in the model is what is really there. But to achieve this product, we do sometimes need to use a couple of tips and tricks to get to that final product.

PAUL BRIEDIS: So we're now going to touch on something else, the first time we trialed something for our team on this project: the 3D RC detailing and drawing production. We'll just point out that this was a Revit 2020 project, and with the tools available to us within that release, we really found it challenging, with the geometry that we had in our models, to get that RC detailing working.

Getting all the geometry and the conditions of the RC cages and so on was very challenging for us to achieve in the decks, the abutments, and other elements, just because we weren't working with nice linear shapes; we were working with, say, bridges that had superelevation on them.

So the deck is warping as it goes along the alignment, and this proved to be very challenging for us. We invested the time and effort to do this, to find out what value we would get from it. And whilst we had the Revit technicians and the engineers working quite closely together as part of this trial, there were a lot of benefits.

Obviously the visual side of things, being able to understand what it looks like and how we can interrogate it, not just in the drawing. So there were benefits. But the downside was, again, that the geometry from our structural model itself just wasn't working as cleanly as we needed it to for this to be an effective way of delivering the output.

So we learned a lot of lessons here. And I'm sure there's a lot of people who have come across or come to the same conclusion or a similar conclusion with a bridge or sort of an asset which is of complex geometry.

I'll just pop back quickly on this one. It's worth noting as well that Revit 2023 has had a lot of improvements, or work done rather, on the features for RC detailing and documentation and so forth. And we're currently doing an evaluation piece: taking what we went through using Revit 2020 on geometry like this and doing the same geometry using Revit 2023, because we always have to look back at where the blockers were in the past and where software functionality has since improved, which can unlock or remove some of those barriers. So Revit 2023, which some of you may be looking at as part of this AU, is something that we're keen to explore further.

Another quick little one, pretty easy. This was about how we connect the highways team, or the highways TIN surfaces generated in Civil 3D, to the bridges team with their models in Revit. Some of you probably well and truly know about this, but some of you may not.

There is a publish surfaces function within Civil 3D where you grab your surface and publish it to BIM 360. From there, a Revit user who's on the same project can import that topography and have it linked directly to their Revit model.

And this is just another thing that we needed to put in place to get the coordination and the information transfer to the right people at the right time. And this was a really good workflow to make sure that was happening.

Does this solve the problems of the geometry changing between the highway and the bridge? No. But it did give us a really good, clean connection, not just for the model but for the drawing production. So the final design and existing topography were referenced cleanly and accurately in the Revit model. That was a good little trick that we applied, and it saved a bit of time and certainly helped us.
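One of the practical things a linked topography enables is querying the existing-ground level at a plan position, for example under a pier or abutment, by interpolating within the TIN face that contains the point. The sketch below shows the underlying barycentric interpolation in plain Python; the TIN face coordinates and pier position are hypothetical values for illustration, not from any project model.

```python
def surface_level(tri, p):
    """Interpolate the level of a TIN triangle at plan point p using
    barycentric weights, the kind of query a linked topography answers
    when checking foundation levels against existing ground.
    tri: three (x, y, z) vertices; p: (x, y) plan position inside the face."""
    (x1, y1, z1), (x2, y2, z2), (x3, y3, z3) = tri
    px, py = p
    det = (y2 - y3) * (x1 - x3) + (x3 - x2) * (y1 - y3)
    w1 = ((y2 - y3) * (px - x3) + (x3 - x2) * (py - y3)) / det
    w2 = ((y3 - y1) * (px - x3) + (x1 - x3) * (py - y3)) / det
    w3 = 1.0 - w1 - w2
    return w1 * z1 + w2 * z2 + w3 * z3

# Hypothetical TIN face and pier position
tin_face = [(0.0, 0.0, 10.0), (20.0, 0.0, 12.0), (0.0, 20.0, 14.0)]
print(surface_level(tin_face, (5.0, 5.0)))  # → 11.5, level under the pier
```

When the published surface is updated and re-linked, the same query returns the new ground level, which is what keeps the bridge model and drawings consistent with the latest highways design.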

Lastly, one of the things that we tested while working on this project was CivilConnection. This is something that's been around for a couple of years now. Here, I've got a video.

So what we've got here: I've got the Civil 3D model on the left, a corridor model. Then, using the specific CivilConnection nodes, generic models are generated in Revit to represent that subassembly, that portion of the Civil 3D corridor.

And you can see here in the demo that it's all set up and it's all working. Whilst this worked, and rather it works, we struggled a bit to use it on the live project, because it has a couple of conditions which are quite obvious: the Civil 3D corridor needs to be nice and clean and complete, and it needs to be showing everything we need.

But also, stability issues were a bit of a problem for us. And one of the big things is the Civil 3D corridor: as anyone who uses Civil 3D knows, it doesn't take long for a Civil 3D corridor model to get really quite large.

So when you have all that information in one model, it's hard to get that connection. So a bit of a challenge for us, but it was a good lesson learned. And I suppose it was always about challenging the way we were working, as Igor mentioned: breaking down the model into separate elements and having those Dynamo scripts in Revit creating the different elements, the decks, the girders, and all that kind of stuff. Two ways to get to the same point, but I think we still preferred the way we were originally working. Igor, go through the challenges.

IGOR VARAGILAL: OK. The challenges. The first one, as we mentioned, was the drawing production and the reinforcement detailing. Yes, we did it, but as we said, there were a lot of challenges, and for this complex geometry there are still a lot of tips and tricks that we need to use to achieve that final product.

The next one was modeling the complex geometry. There were a couple of elements, like the abutments or some decks or girders, that were a bit more difficult to model. And although we could achieve it in Revit, it was sometimes difficult; in examples like the abutments, which are skewed and follow different planes, it was very difficult to model that in Revit.

Well, this was a very Dynamo-heavy workflow, so to make this work for you, you need to have expertise, and it was essential to have that expertise. This was a challenge in the beginning, but we also used it as a lesson learned, as an opportunity to upskill our team's Dynamo capabilities.

And then, as we also said, we needed to have consistency between the Revit families that we were using. Because we didn't have that Content Library, sometimes that consistency just wasn't there.

PAUL BRIEDIS: So looking at the Dynamo scripts, the lessons learned. Keeping those scripts small, easy to debug, and easy to use for the more junior staff members was really important. We wanted to give these staff the tools to do their job so they weren't reliant on the more experienced team members.

And again, in our experience, that raised the bar. At the minimum capability levels, where we're trying to grow team members, this certainly enabled that to take place. One challenge was that we didn't have a Content Library easily accessible to the team. And that's a lesson learned: we really need this library to get efficiency, quality, and consistency.

Model breakdown and constructability, as Igor mentioned. Understanding constructability, and what happens to the model once it's generated, makes you start thinking differently about how to generate it and what information to include. That was a good lesson learned for us. And it really highlights the importance of, it may seem very fundamental, ensuring that we understand what our client or our contractor client needs or wants from this model.

And understanding that early is really what we all need. For instance, it's not just looking at a scope and delivering it. But diving a bit deeper can really enhance the way we work and have that better working relationship.

Published surface is very helpful. And yeah, so this is another one we haven't touched on before. Getting the Revit team who are generating these models, getting the fundamentals of Civil 3D is actually really-- yeah, it's quite important because one of the things we found is, say, a highway geometry will be updated.

And OK, there's that model sitting over there within Civil 3D, and we need to extract the feature lines, as an example. For anyone who uses Civil 3D, extracting feature lines from a corridor is a really straightforward process, but unless you know how to do it, you're relying on someone else to go and give you that information. And whilst you've got the capabilities there within the team, we want to avoid situations where the highways users are not available and tell us, "I'll have to give that to you in a couple of days."

Well, no. Let's go and do it ourselves in a managed way, so we know we're working with the right information. But that kind of thing was a bit of an eye opener: it wasn't just Revit that staff needed to know. They needed to know the basics of Civil 3D to keep them moving.

Right. Project number 3, a rail project in Australia. This was a tender project delivered as part of a DJV, 130 kilometers in length. Our scope was the design and documentation of 68 road bridges, rail bridges, and viaducts, and the total length of all of those assets was 15.5 kilometers. Like any tender project we've worked on, the program was very challenging.

This was done during the height of COVID, so we were all working at home, and that was very, very challenging. So again, we've got Revit 2020 and Dynamo. And for this project, we had our rail and road geometry generated in 12D, which is a very common rail and highway platform within Australia and other countries.

A little bit more about the project itself. Because this was a tender project, the rail alignment was continually changing. When we say changing, we mean trying to get the optimized alignment for this length of track. And for 130 kilometers, that is very challenging, of course.

To give some sort of metric for the scale of the change, the responding-to-change piece that we said is a challenge, it was nowhere more apparent than on this project. Of those 68 structures we worked on, approximately 50% had to be updated every week. Full stop. That's a major challenge for us.

So for that reason, even before knowing there were going to be that many structures changing per week, we knew we needed an innovative approach to respond to change. And we also needed to develop an innovative approach to extracting the quantities of each of these structures.

And so we'll go through that. So we've just got a quick little fly through on the federated model. And so this is just one portion of the road. So we're just looking here at a couple of the viaducts we have. We've got lots of cut and fill areas, which are sort of highlighted as well. And we're just trying to give you an idea of what we're working with here.

So the geometry itself, I believe it was LOD 3 for this project. And the scale, the 15.5 kilometers we mentioned, is one of the key points here. This was a really unique project for us and something which presented some very significant challenges to overcome, both in working with our engineers and in using the software and the technology. Over to you, Igor.

IGOR VARAGILAL: Thank you, Paul. So to achieve that, our first step was to define the substructure and superstructure elements, the geometry itself. This was defined in a matrix in Excel, and in that matrix we had the attributes for these elements. These attributes covered dimensions and other graphical and nongraphical information.

The first step was to define a code for each substructure element. It's important to say that this was all driven by the engineering team. They were the ones owning this matrix, this Excel spreadsheet. A specific code was assigned to each of the substructure elements you see here. And what we did is create Revit families, each with a family type carrying the same code as in the Excel spreadsheet.

As I said, these graphical and nongraphical attributes were defined in the same matrix. We have examples of the pile diameter and the pile spacing, some for the columns and for the headstocks, and there are more attributes not shown here because the matrix was, in fact, very big.

There was also some nongraphical information such as the reinforcement ratio or the concrete class for those elements. Very importantly, this matrix was being updated several times per week. Because of that, and this is something Paul mentioned that I would like to stress, we needed to be really flexible and agile in responding to changes in this Excel spreadsheet and reflecting them in our Revit families.
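The code-keyed matrix described here can be sketched in plain Python. This is only a hedged illustration: the real project read the Excel matrix through Dynamo, and the codes, column names, and values below are invented for the example.

```python
import csv
import io

# Hypothetical extract of the structure definition matrix; the real
# spreadsheet was owned by the engineering team and was much larger.
MATRIX_CSV = """code,element,pile_diameter_m,pile_spacing_m,concrete_class,reinforcement_ratio
SUB-A1,Abutment,1.2,3.0,C40/50,0.015
SUB-P1,Pier,1.5,3.5,C40/50,0.018
SUP-G1,Super T girder,,,C50/60,0.020
"""

def load_matrix(text):
    """Key each row by its element code, so a Revit family type carrying
    the same code can look up the attributes that drive it."""
    rows = csv.DictReader(io.StringIO(text))
    return {row["code"]: row for row in rows}

matrix = load_matrix(MATRIX_CSV)
print(matrix["SUB-P1"]["pile_diameter_m"])  # "1.5"
```

The one design point worth copying is the shared code: because the spreadsheet row and the Revit family type use the same key, a weekly matrix update can be pushed to the model without any manual matching.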

The second step was to define those families, to create those components in Revit. We have here a little video of what we commonly call the Revit Master Model. This is where we had all of our Revit families, and during the lifecycle of the project, if a modeler needed a new family, this is where they would get access to it.

This is also where, with the Dynamo script we created, we read that first matrix we showed earlier. Any updates to that matrix would be represented here.

So if, for example, a column diameter changed from 2 meters to 2.5, at the click of a button we would get that change reflected in our master model. The modeler could then have confidence that they were getting the most up-to-date family. It's also very important to note that although I'm only showing the Revit families here, the master model is also where the important information for our project was living. So any update to the title block, or to the text styles or line styles that we needed to create our drawings, this is where the modeler would get those changes.
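The click-of-a-button update can be reduced to a small synchronization routine. A minimal sketch, assuming plain dictionaries stand in for Revit family types (the real script would set parameters through the Revit API from Dynamo; names and values here are illustrative):

```python
def sync_family_types(family_types, matrix):
    """Overwrite each family type's parameters with the matrix values for
    the matching code, returning the list of changes that were applied."""
    changes = []
    for code, params in family_types.items():
        target = matrix.get(code, {})
        for name, new_value in target.items():
            old_value = params.get(name)
            if old_value != new_value:
                params[name] = new_value
                changes.append((code, name, old_value, new_value))
    return changes

# The column diameter example from the talk: 2.0 m becomes 2.5 m.
family_types = {"SUB-P1": {"column_diameter_m": 2.0}}
matrix = {"SUB-P1": {"column_diameter_m": 2.5}}
print(sync_family_types(family_types, matrix))
# [('SUB-P1', 'column_diameter_m', 2.0, 2.5)]
```

Returning the change list is useful in practice: it gives the modeler an audit trail of exactly what the weekly matrix update touched.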

So the Revit Master Model was, in fact, serving as a live Revit template. It started from the Revit template we had, which, as I said before, had been improved from the previous project. But then we started this Revit Master Model with that template and changed it live. And we had that flexibility.

Well, the next step is actually to build our structure. Our first input was the rail design string. This is what we would get from the railway team. We would also get the same design string draped to the topography. And this is what we would read with our Dynamo scripts.

Thanks again to the engineering team, which was very engaged and was driving us to do more and better. Again, in an Excel spreadsheet, we had the definition of the structure, with the elements of the structure defined there. There were a couple of manual inputs that we had to have because the input from the railway team was not always the same.

So we sometimes needed to define offsets from where the bridge started and ended, compared to where the design string was starting and ending. That's one of the inputs we had to have. The next thing was to run the Dynamo script, reading those distances along that same string.

And using the code that I previously mentioned, the code for the substructure, we were able to place those elements along that string. Here is the example of the abutments. We also have the example of our substructures, in this case an ROP with four piles.

The superstructure was also placed by reading from this Excel spreadsheet. So there was the definition of the super T girders, and then the deck placement on top. And as I said, this was our structural definition matrix, all thanks to the engineering team, which was again driving this spreadsheet. Over to you, Paul.
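The placement step, reading distances along the design string and dropping coded elements at those chainages, boils down to interpolating along a polyline. A simplified sketch, assuming a flat 2D string and invented codes (the real Dynamo script worked on the 3D rail design string inside Revit):

```python
import math

def point_at_chainage(polyline, chainage):
    """Walk a 2D polyline (the design string) and return the point at the
    given distance along it, by linear interpolation within a segment."""
    travelled = 0.0
    for (x0, y0), (x1, y1) in zip(polyline, polyline[1:]):
        seg = math.hypot(x1 - x0, y1 - y0)
        if travelled + seg >= chainage:
            t = (chainage - travelled) / seg
            return (x0 + t * (x1 - x0), y0 + t * (y1 - y0))
        travelled += seg
    return polyline[-1]  # beyond the string: clamp to its end

# Hypothetical structure definition: (chainage along string, element code).
string = [(0.0, 0.0), (100.0, 0.0), (200.0, 0.0)]
placements = [(0.0, "SUB-A1"), (50.0, "SUB-P1"), (150.0, "SUB-A2")]
for chainage, code in placements:
    print(code, point_at_chainage(string, chainage))
```

In the project workflow the chainages came from the engineering team's spreadsheet and the codes resolved to Revit family types, so the same loop placed abutments, piers, girders, and decks.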

PAUL BRIEDIS: So another output of our delivery was understanding the quantities of the structures throughout the project. At this point, we're moving very fast. We're responding to change very well, and we've got a good process to update the content and get it to the individual modeling teams across those 68 models.

But we then had to, several times a week, extract the quantities from each of those models and build a sort of master quantities breakdown, to then engage with our DJV partners and the project partners to understand the cost, the program, the carbon, all of that.

And we needed a really slick way to do that. So for this, we used Autodesk Assemble. So a really quick breakdown as to how we were applying that. So we have our native Revit models. What we're then doing for our workflow, we're putting those into Navisworks.

So that was just generating an NWC. With that Navisworks model, we know it's all in the right position and so on. As you can imagine, across 130 kilometers, there was a lot of regular checking we needed to do to make sure there were no models flying off in space.

So we got those Navisworks models, and from there we're publishing these models to Assemble. Assemble is a cloud-based platform which hosts the model data, with both the graphical and the nongraphical information represented. And it's a really powerful tool to be able to interrogate, break down, manage, and understand that data in a very sophisticated manner.

So it's a really powerful tool, Assemble. From there, where we had all our models published, we could then-- I say all. Excuse me, just before the next step: there is an alternative. You can publish straight from Revit. But for other reasons, we went through Navisworks.

Once it's in Assemble, this is where we were grabbing an Excel export, but then also dabbling with the Power BI side of things. And with that process, we were able to, very regularly, several times a week, capture everything, all of the quantities from each of the structures, as of, say, the end of Tuesday. And we could repeat that every day. And at some points, we were.
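The master-quantities roll-up described here is, at its core, a grouped sum over per-structure exports. A hedged sketch, with the Assemble Excel export simplified to in-memory lists (structure names, materials, and volumes are all invented):

```python
from collections import defaultdict

def merge_quantities(exports):
    """Roll per-structure quantity exports into one master breakdown
    keyed by material, summing volumes across all structures."""
    master = defaultdict(float)
    for structure, rows in exports.items():
        for material, volume_m3 in rows:
            master[material] += volume_m3
    return dict(master)

# Hypothetical exports for two of the 68 structures.
exports = {
    "BR-01": [("Concrete C40/50", 320.0), ("Steel reinforcement", 41.5)],
    "BR-02": [("Concrete C40/50", 185.0)],
}
print(merge_quantities(exports))
# {'Concrete C40/50': 505.0, 'Steel reinforcement': 41.5}
```

Because the merge is a pure function of that week's exports, rerunning it after every model update gives a repeatable snapshot, which is what made the several-times-a-week cadence workable.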

We needed this system to avoid any sort of manual generation of schedules and things like that, because that just would not have worked on this project.

Project summary. So I'll go through the challenges. A major project for us, with a tender design program that was very challenging. And hey, it's just something that we've all come across. Responding to change: a major challenge, but I think we did that in the most effective way we could have done it.

Something I have to point out. Half the design team was sitting in Australia, and we were sitting in Europe, along with a lot of the DJV team. So a very important point: for all of this work, we're not working with the same people in our office. We're working with people with a couple of hours' overlap during our working time to actually communicate these changes.

So that's actually a really significant thing to consider here: we're doing all of this work, and we're trying to make it work by feeding each other good-quality information so that when they start their working day, they can continue with it. A lot of work went into making sure we had that right level of communication.

Quantity extraction, a challenge, but we were able to do that I think in the most effective way. Keeping Excel format constant. What was that one, Igor?

IGOR VARAGILAL: Yeah, so we found out throughout the lifecycle that, for this workflow to work, we needed the Excel format to be constant, because we are reading that information with the Dynamo script. But because we had so many engineers and modelers working in that same Excel, sometimes the format was not constant. So it's really important to keep that format the same; if that's not the case, the workflow can break. This was a challenge for sure.
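One practical mitigation for this kind of breakage is a fail-fast header check before the script touches the spreadsheet. A minimal sketch, assuming a hypothetical expected column list (the real matrix columns are not documented here):

```python
# Hypothetical expected layout of the structure definition matrix.
EXPECTED_HEADER = ["code", "element", "pile_diameter_m", "concrete_class"]

def check_header(header):
    """Fail fast with readable messages when the spreadsheet layout
    drifts, instead of letting the downstream script break mid-run."""
    missing = [c for c in EXPECTED_HEADER if c not in header]
    extra = [c for c in header if c not in EXPECTED_HEADER]
    problems = []
    if missing:
        problems.append(f"missing columns: {missing}")
    if extra:
        problems.append(f"unexpected columns: {extra}")
    return problems  # an empty list means the format is intact

print(check_header(["code", "element", "pile_diameter_m", "concrete_class"]))  # []
print(check_header(["code", "element", "pile dia", "concrete_class"]))
```

A check like this turns "the Dynamo graph failed somewhere" into "someone renamed a column", which is much easier to hand back to a large, distributed team.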

One of the lessons learned was that we needed earlier engineering engagement. I mentioned that Excel spreadsheet. If we had had earlier engagement from the engineers, the spreadsheet format would have been defined from the beginning. Then those changes throughout the lifecycle would not have happened, and we would not have had those challenges and problems.

Also, as we saw during the video showing the master model, there were a lot of Revit families inserted after the project had already started and after we had modeled our bridges. We believe that with this early engineering engagement, those families, and the variations of those families, would have been defined from the beginning. And that would have been easier for everyone.

PAUL BRIEDIS: Something that we touched on in the previous project example: it's a good lesson learned that we went a bit beyond the required level of detail, or level of definition, of our modeled elements, so our sub- and superstructures.

And the reason for that, so we added more detail than what was required for our models. And the real driver for that was to try and give us a more efficient method of driving the drawing production. So that was a major thing.

With more detail in your model, whilst you still have to do a lot of careful management of the drawing itself, you generally get a much-improved way of working. But that higher LOD also led to a better quantity takeoff. So there were two benefits there. That was one of the things we drove.

One of the other things we got from using Assemble, and this is something we're going to touch on in the final project, is having a full breakdown of the content of each of those models, and subsequently the content of our deliverables, our structures. The engineers had a significant advantage over other projects in that they could see what was in the models. They had that breakdown, not just visually in Navisworks or BIM 360 or Revit; they could see all the data of the components being used.

And that gave everybody so much higher confidence that what we were delivering was true and accurate. I mean, yes, we still made mistakes, and there were still errors, of course. We were not perfect in that respect. But we had so much more visibility and control, and then subsequently confidence, that what we were sharing with the client was indeed our design intent.

The Assemble Power BI dashboard. Something we only did at the very end was playing around with Assemble and Power BI as a dashboard. That really would have helped us earlier, but that was our lesson learned for the next job.

And here's the next job, our final project here: a major highway project in the UK. Detail design again, with 18 bridge and viaduct structures, so a big step up from the previous project in the UK, plus a lot of major and minor culverts. Again, we had a digitally mature client, and this client was pushing us in the same way. So we were really working well and developing a very good collaborative relationship with this client.

A very digitally advanced project for a number of reasons. We were still using Revit 2020 in this case, and Civil 3D the same. But now we're going into using Grasshopper and Rhino, which Igor will get to in a moment. Just to get some context, we'll fly around the federated model.

So this is just a quick little 30-second video. So this is one part of the project. And Mott MacDonald was responsible for all the engineering services, design services for everything you see here.

So highways, bridges, geotech structures, earthworks, lighting, drainage, utilities, the whole lot. Yeah, so a normal complex detail design project with full model delivery and lots of embedded information, lots of attribute information required as part of this project. Igor.

IGOR VARAGILAL: So what did we do on this project? As Paul already mentioned, there were two key changes from the previous major highway project in the UK. The first one is that we used Rhino and Grasshopper, an algorithm-based approach for our bridge design.

And the second one is that we had greater model attribute requirements. Each of our bridge elements had to have 27 attributes defined. And these were all client requirements.

This is a bigger picture of our workflow. It all started, as I said, in Rhino and Grasshopper. And our inputs are the highway 3D strings, where we input that into Rhino. And using, as I said, an algorithm-based approach, we are able to create the bridge geometry in Rhino.

It's in Grasshopper that the geometry actually gets created. And from Grasshopper, we export that model as an analytical model to Midas or SOFiSTIK.

These are analysis packages where the engineering team does its calculations. We are also able to export the delivery model, which is the Revit model, that afterwards gets exported to Navisworks, with Assemble and Power BI used on top.

Just looking at the first step of our workflow, this was creating that geometry in Rhino using Grasshopper, with an input as the highway string. So as I said, it is the highway strings imported into Rhino. We use Grasshopper, and we are exporting this to Revit.

We are doing this using Rhino.Inside.Revit, but I will show that in the next step of the workflow. The next step, before we even go to the delivery model in Revit, is to export this geometry to the analysis software, either SOFiSTIK or Midas. We do this through two types of files, depending on the software we are using.

This file currently contains the geometry information, but in some cases we also export loads, or the axis of the carriageway for some of the bridges, which afterwards allows the engineer to place the live load on that axis. It's very important to say that we are continuing to improve our ability to apply this workflow, in particular around the connection between the graphical and analytical models. And we want to make the speed with which we generate the model as fast as possible.

Mott MacDonald has invested a lot of time and effort developing this workflow. And we are now gaining significant working knowledge about the benefits and also the shortcomings of this workflow and what this can bring into our project team and the efficiencies that we can get.

The next step is to export this geometry into Revit. And you see here an example of our Revit export in Grasshopper. Although this looks messy right now, it's important to say that we have been evolving our workflow. We have been tidying it up. And you see there's some color coding. We have been trying to make it, again, as easy as possible for other users to start using it.

So this is the geometry that we have now in Grasshopper. As I said before, this gets exported to Revit using a plugin some of you in the audience may be aware of, Rhino.Inside.Revit. And this is then the final product that we get in Revit.

From Revit, we go to Navisworks, and we also use a Power BI and Assemble combination to check the attributes. Those 27 attributes that I mentioned before get checked after we export that attribute information from Revit.

It's very important at this point to compare the workflow we are using now to what we were doing, for example, on the first major project in the UK. We are still using an algorithm-based approach, but now it's with Rhino and Grasshopper.

To be quite honest, although we were not quite there at the beginning of the project, we have been developing this over the past eight or nine months. We are now starting to see that there are advantages to using Rhino and Grasshopper, and we are getting more and more comfortable with this workflow.

PAUL BRIEDIS: The attribute information that we had on this project was a big step up, the largest that our team had worked with. For all elements across all the disciplines, there was a collection of attributes which our client had defined, saying, right, all of these model elements must contain a value against these attributes, more or less. Some attributes were picked up at different stages of the project. But we had a collection of 27-odd attributes which we needed to ensure had the correct information in those fields.

And the problem we had, like in that rail project in Australia, is we've got all these models with all these elements. How do we get the project team understanding what's in there before we issue to the client? So we've got our federated model submission every two weeks, which is pretty common. Let's just say 10 of those structures get updated for that submission.

How do we get the team, not just the author who generated the model, but also the person checking the model through the different workflows, and finally the discipline lead, to say, yes, these all comply with the project requirements?

Our client was also checking themselves that the values we were defining all complied. So we needed a system to make this work. We reached for Assemble, like I mentioned on the previous project, but here we're publishing our models to Assemble.

And now we're building on top of that a Power BI dashboard. So with that dashboard, we're now filtering through all of the different attributes from, not just the file name, but also the type of layer, which is all defined through the project, the load matrix we had for the project.

And you can see here, I'm just flicking between a couple of different structures and also different types of components of the structures. You can see the material, the material grade, and the reinforcement ratio all coming up. So this is how we can see the values. We've also got other checks, where we're actually verifying: do these values comply with the Project Standards?

Because sometimes you could have a concrete grade which satisfied the check in that it had the correct spelling, "concrete cast in place" as an example, but it may not be assigned to the correct elements.
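That distinction, a value that is spelled correctly but sits on the wrong category of element, is exactly what a per-category rule table catches. A hedged sketch with invented categories, attributes, and allowed values (the project's actual 27 attributes and standards are not listed in the talk):

```python
# Hypothetical rules: which attribute values are allowed per element category.
RULES = {
    ("Structural Columns", "Material"): {"Concrete cast in place"},
    ("Structural Framing", "Material"): {"Concrete precast", "Steel"},
}

def check_elements(elements):
    """Flag elements whose attribute value is either misspelled or valid
    in itself but assigned to the wrong category of element."""
    issues = []
    for elem_id, category, attribute, value in elements:
        allowed = RULES.get((category, attribute))
        if allowed is not None and value not in allowed:
            issues.append((elem_id, f"'{value}' not valid for {category}.{attribute}"))
    return issues

elements = [
    (101, "Structural Columns", "Material", "Concrete cast in place"),  # OK
    (102, "Structural Framing", "Material", "Concrete cast in place"),  # wrong category
]
print(check_elements(elements))
```

Keying the rules on (category, attribute) rather than on the value alone is the point: element 102's value is a perfectly good string, just not for that category, and a spelling-only check would have passed it.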

So there were lots of different variants we had to try and account for. And again, this dashboard really opened up the connection between the engineer, who designed and owned these bridge structures, the author of the model, the technician, and those working in Grasshopper and Rhino, where a lot of these attributes were still defined at that early stage, as Igor just mentioned.

So this is now becoming business as usual for us, because we need our engineering teams, our project teams, to understand and have confidence that what we're submitting and sharing with our clients complies with the Project Standards. It's fundamental to the way we're working.

So the challenges on the project. A big project, lots of challenges, a multi-discipline project; those ones are just always there. More specifically, one challenge we had was the definition of the highway strings.

This was different from just getting the strings, or even ourselves grabbing the strings from the Civil 3D corridor and bringing them into our Revit model to do the work. We really needed some clean definition. When I say clean, we needed some very well-defined outputs from the highways, from the Civil 3D corridors.

So all the naming had to be correct, and even the length of the strings to bring into Grasshopper and run the algorithm had to be very well defined and then executed. That was a challenge for us.

This algorithm-based approach and those using it, that was a challenge. As Igor mentioned, we started eight, nine months ago using this. And that was a big challenge. But week by week, month by month, we're getting better and more comfortable with it. And we're getting the value out of it.

Completeness and consistency of attribute information, like I mentioned, was a challenge. And it's important to point out that these attributes didn't need to be populated just at the end of the job for final handover. They needed to be populated as early as possible, because when they're defined early, our client can understand the asset information, the cost, the program, the carbon output from that information. So we had to reach that high level of completeness of attribute definition early in the project. Lesson learned. Igor.

IGOR VARAGILAL: Thank you, Paul. First lesson learned that we are highlighting here is the communication between disciplines. We talked about the challenge of the definition of those highway strings, but in order for us to model our bridge and sometimes to produce our drawings, we needed to have a clean communication also with some elements of the drainage team. And we needed to include those outputs from those teams as well. And sometimes, that would be complicated. Paul.

PAUL BRIEDIS: Yeah, just I suppose on that, the communication with disciplines is one thing, but maybe just to clarify a little bit more. It's about disciplines understanding what other disciplines need. And communication's good, but if you don't actually understand what the other member of staff needs, the challenge becomes a lot harder.

So it's understanding what other disciplines need. And again, it sounds fundamental, but it's something that's very hard to do. Remember, half of us are still working from home, and half of us are in different offices. So that's a big part of it. Igor.

IGOR VARAGILAL: The second lesson learned was that Grasshopper and Rhino, the combination of these two tools, allow us to define this complex geometry a bit better. We mentioned as a challenge on the first major highway project in the UK that this would sometimes be a bit difficult to do in Revit and Dynamo. And as some of you may know, Grasshopper and Rhino are a bit better for creating and developing these complex geometries.

Then there's the model QA and QC to enhance the quality of the deliverables. This is what Paul was saying: when both the engineer and the author of the model are able to see the output, and we have that as a dashboard, it definitely enhances the quality of our deliverables.

And then, the role of the drawings. As a lesson learned, we found out, or rather, we already knew this, but it was reinforced, that the drawings are definitely one of the main outputs. This is what the engineering team is looking at as an output.

PAUL BRIEDIS: Yeah, just on that one, the role of the drawings. We're doing all this great stuff now with Grasshopper and Rhino, and the engineer is very much involved in that process. But at the end of the day, drawings are still fundamental to their understanding of the bridge itself and the challenges and the issues and the solutions involved in it.

So drawings are something that we always want to try and remove, but they're still very much there, and still very much a part of that early design work being done in the geometry creation in Grasshopper and Rhino. We need the drawings to follow that initial algorithm very closely, because that is really still where our engineers need to be focusing. So the role of drawings came into a different light in this work.

Right. Closing. So anything else? A few other key points. Moata Intelligent Content. Let's talk about how we're driving quality and consistency across our team. Mott MacDonald have developed Moata Intelligent Content. This is a platform, an online tool, for publishing verified data; in this instance, we've got our models. So this is a parametric model of a railing. All the components that we as Motts are generating across each of our teams, from all disciplines, we're able to have in a very easily accessible online library, a catalog, for staff to use.

So here, I'm just demonstrating how we insert a railing onto one of our pile caps. And the whole point is that through the use of this tool, and the sharing and uploading of this content, we're able to have much more of a... well, the value speaks for itself.

But it enables us to have more of a playbook approach to bridge delivery. You need to do this, you go there. You need to do that, you go here. And this satisfies one of those things. So we want consistency across our team. We want to be able to ensure that what we include in our models is clean and accurate.

This is a good example of how the information that we're seeing in Assemble and Power BI can be driven through this. If we know people are all using the same content, then we're going to get the same result. It stops people wasting time, and we become so much more efficient. So Moata Intelligent Content is a big part of what we're doing, and we're very keen on promoting it.

We touched on Inventor and InfraWorks at the very start. They've got their place in this workflow, and some of you may have been thinking, well, when are we going to use them? We've worked on projects where they could have been used, and it was a bit of a challenge for us.

And we are currently working on one project with Inventor, creating parametric content, which we're looking at here, an abutment or a parapet, and bringing this content into InfraWorks.

And Autodesk are investing a lot of energy and added functionality in the bridge component of InfraWorks. So what we've got here is an early stage project where we've got the actual Civil 3D model linked dynamically to this InfraWorks model. We can say, right, the bridge starts here and ends here. We're creating our bespoke content in Inventor, importing it into InfraWorks, and creating our bridge.

These tools we're putting in place are quite straightforward — InfraWorks is a very easy tool to use, especially once you have a good library of content behind it. And as I said, this is software that is advancing quite steadily. I encourage you to look at it, because for early-stage design it's really helping our engineers understand three-, five-, six-span bridges: what's the girder positioning?

What height constraints does the girder depth put on our design? All of these fundamental early design aspects you want to consider — we're getting real value out of InfraWorks there. There are other workflows too, exporting to Revit and Civil 3D, which we're not going to go into here. But it is a very powerful tool. Igor.
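Those early checks — a preliminary girder depth against the span, and the vertical clearance that depth leaves — are simple enough to script alongside the InfraWorks model. A hedged sketch in plain Python (the span-to-depth ratio of 20 and the 5.3 m clearance are illustrative assumptions, not values from any standard):

```python
def girder_depth_check(span, deck_level, obstacle_level,
                       span_to_depth=20.0, min_clearance=5.3):
    """Rule-of-thumb girder sizing: estimate depth from a span-to-depth
    ratio, then check the vertical clearance left under the soffit.
    All levels and lengths in metres."""
    depth = span / span_to_depth            # preliminary girder depth
    soffit = deck_level - depth             # underside of the girder
    clearance = soffit - obstacle_level     # headroom over the obstacle
    return {"depth": depth, "soffit": soffit,
            "clearance": clearance, "ok": clearance >= min_clearance}

# A 32 m span with the deck at 12.0 m crossing a road at 5.0 m:
print(girder_depth_check(32.0, 12.0, 5.0))  # depth 1.6 m, clearance 5.4 m
```

Even a back-of-envelope function like this, run across candidate span arrangements, answers the "will the girder fit?" question before any detailed modelling starts.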

IGOR VARAGILAL: Yeah. Something else we're doing, always in parallel to our project work — and we touched on it when talking about the project work — is developing our capabilities and what we can do with the software.

We've been continuously training our team. We developed trainings we call Dynamo and Revit Ninja, to give users dedicated time to learn the software — because we know you can only learn Revit and Dynamo so far through training; it's experience that actually equips you to use them on a project.

Same thing with Python: we recently started running training oriented towards Python. Again, we like to give our trainings funny names — this one we call PyCharmer.

Another thing we did across these projects: every time we finished a project and believed we had a really good workflow worth demonstrating to the rest of the business, the rest of the group, we would run a technical demonstration. In reality, the presentation and case studies we're showing you here are the result of all those technical demonstrations of the workflows we've been talking about.

Also very important is internal and external training. We've been running internal trainings like Dynamo and Revit Ninja and PyCharmer for our team, but we also look outwards to build more capability with the software. So what are the next steps and lessons learned?

PAUL BRIEDIS: These are the ones we highlighted at the start. The work's not done — what's in store for the next five years and beyond? We're still focusing on that single-source-of-truth approach. There's a lot of value to be gained from the Grasshopper and Rhino approach, and a lot of effort needed to get the most out of it — that's what we want to continue working on.

Drawing production. How do we ensure that becomes as easy as possible? It's a real challenge for us because it's a heavily industry- and client-driven deliverable. Not too much more to be said about that in the time I've got available.

Knowledge sharing. As Igor pointed out, it's ongoing — it's not that we're going to do anything different; it just has to be maintained. Specifically, whenever we've got a solution we think works — something better than what we did yesterday — we need to make sure the team of tomorrow knows about it. We can't keep it trapped within one team; we have to share it.

And a very big lesson learned here: when you share your experience, you get a lot back, because people respond to it. I definitely encourage you to do that within your teams. That's why we're here at AU, sharing and attending these presentations.

Improved team engagement. This touches on a couple of things: understanding how other teams work, and what the other disciplines are trying to achieve, is really fundamental. And it comes down to early project planning — we all understand its importance, but we want to make sure we have real clarity on what each of us needs for success.

I'll just touch on technology, and I'll leave the last part for Igor. Capability limitations — we've mentioned the information management side of things. BIM 360, ProjectWise, and other platforms continue to be a big hurdle for us.

Large projects, working across different continents and time zones, with lots of information going backwards and forwards — it's very challenging to do our job.

Everything we've seen here comes down to smart, dedicated people delivering these bridges and other assets — and they can only do their job when they've got access to the right data. That continues to be an ongoing challenge, and we've got a lot of solutions we're trying to apply here.

Interoperability between the model and the data — touched on before. Reducing repetitive tasks through automation — fundamental, we all know it. If we're doing all those smart things with bridge delivery and other assets, we also have to keep reducing the time spent on lower-value but still necessary tasks, and that has to stay front of mind. Keeping up with technology advancements — hey, that's why we're at AU.

IGOR VARAGILAL: The last ongoing challenges, as you can see here, are "it's not just a pretty-looking 3D model" and "evolution, not revolution" — the ones we mentioned at the beginning. We also want to keep developing the understanding and the value of digital delivery. We do it by sharing, and by doing case studies like the first project we showed, where we measured the value we would get out of a workflow.

We want to demonstrate the value of digital delivery from the start. And we understand there's still work to be done — that's why we're here, to talk, to present, and to share this information with all of you.

Just some final remarks. As we said, this is a continuous process. We've been doing it for five years, and we believe it will continue. This is an evolution, not a revolution.

We are not trying to change hearts and minds overnight. We want to continue down this path — as I just said, it's a continuing evolution.

We also believe it's very important to keep challenging ourselves — and yourselves. When we have a project and a workflow, we have to keep asking whether that's really the best way to do it.

On one of the projects we mentioned, the rail tender design in Australia, we would not have been able to do what we did without such an engaged team — a team that pushed us to create a workflow that could respond quickly to change. So keep challenging yourselves; that's what drives innovation overall.

PAUL BRIEDIS: Thank you very much.
