
Build Your Own Unassisted PowerShell Uninstallers

A fair chunk of what I do these days in the office is around testing software prior to packaging and deployment.

If you have ever had to install and uninstall Autodesk software for testing purposes, or you just wanted to get rid of an old version of the software, you'll know that it's not as simple as it probably should be. Rather than just uninstalling Revit, you need to uninstall Revit and a whole host of other applications.

Yep.. that’s a lot of clicking

A while ago, I posted a solution for uninstalling the 2015 Building Design Suite with PowerShell. The problem, however, is that this solution no longer works with current PowerShell: it was written for v2.x, and Windows 10 is deployed with v5.x.

To get the job done in Windows 10, first we want to get a list of all the installed applications on the machine. I just want the name of each package, so we type the following at the PowerShell prompt. Of course, make sure that you're running PowerShell as an administrator.

Get-WmiObject -Class Win32_Product -Computer . | Select-Object Name | Export-Csv -Path C:\ListSoftwareResults.csv -NoTypeInformation

This produces a handy little *.csv file with a list of all the installed applications.

You can actually pull more information than just the name; it's as simple as separating the properties with a comma:

Get-WmiObject -Class Win32_Product -Computer . | Select-Object IdentifyingNumber,Name,Vendor,Version,Caption,LocalPackage | Export-Csv -Path C:\ListSoftwareResults.csv -NoTypeInformation
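As a side note, Win32_Product queries are slow and can trigger a consistency check of every MSI package they touch, so if all you need is the list, a read-only registry query is a quicker alternative. A minimal sketch, assuming the standard uninstall registry keys (it won't catch every installer type):

# Read the 64-bit and 32-bit uninstall registry keys instead of querying Win32_Product.
$uninstallKeys = @(
    'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall\*',
    'HKLM:\SOFTWARE\WOW6432Node\Microsoft\Windows\CurrentVersion\Uninstall\*'
)
Get-ItemProperty -Path $uninstallKeys |
    Where-Object { $_.DisplayName } |
    Select-Object DisplayName, DisplayVersion, Publisher |
    Sort-Object DisplayName |
    Export-Csv -Path C:\ListSoftwareResults.csv -NoTypeInformation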

From here we need to wrap the names of our software into this handy little script.

# Remove applications if installed.
$programs = @(
"Software Name 1",
"Software Name 2"
)
foreach ($program in $programs) {
    Write-Host "Looking for $program."
    $app = Get-WmiObject -Class Win32_Product -Filter "Name = '$program'"
    if ($app -ne $null) {
        Write-Host "Uninstalling $program."
        $app.Uninstall()
        Write-Host "$program uninstalled."
    }
    else {
        Write-Host "$program not found."
    }
}
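If you want a bit more feedback than a blanket "uninstalled" message, the Uninstall() method returns an object with a ReturnValue property (0 means success), so you can report failures too. A hedged tweak to the middle of the loop:

# Capture the result of the uninstall and check the MSI return code (0 = success).
$result = $app.Uninstall()
if ($result.ReturnValue -eq 0) {
    Write-Host "$program uninstalled."
}
else {
    Write-Host "$program failed to uninstall (return code $($result.ReturnValue))."
}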

So for example, if we just wanted to uninstall Revit 2019 and its associated packages, we would use the following:

# Remove Revit 2019 applications if installed.
$programs = @("Autodesk Revit 2019.2",
"Autodesk Revit 2019.1",
"Autodesk BIM 360 Revit 2019 Add-in 64 bit",
"Autodesk Revit Infraworks Updater",
"FormIt Converter For Revit 2019",
"Revit 2019",
"Autodesk Revit 2019 MEP Fabrication Configuration - Metric",
"Autodesk Advanced Material Library Base Resolution Image Library 2019",
"Batch Print for Autodesk Revit 2019",
"Autodesk Collaboration for Revit 2019",
"Autodesk Material Library Medium Resolution Image Library 2019",
"Worksharing Monitor for Autodesk Revit 2019",
"Autodesk Revit 2019 MEP Fabrication Configuration - Imperial",
"Autodesk Material Library Low Resolution Image Library 2019",
"Autodesk Revit Model Review 2019",
"Autodesk Advanced Material Library Low Resolution Image Library 2019",
"Autodesk Workflows 2019",
"Autodesk Material Library Base Resolution Image Library 2019",
"eTransmit for Autodesk Revit 2019",
"Autodesk Material Library 2019",
"Revit IFC 2019",
"Autodesk Revit Content Libraries 2019",
"BIM Interoperability Tools for Revit 2019",
"Autodesk Advanced Material Library Medium Resolution Image Library 2019")
foreach ($program in $programs) {
    Write-Host "Looking for $program."
    $app = Get-WmiObject -Class Win32_Product -Filter "Name = '$program'"
    if ($app -ne $null) {
        Write-Host "Uninstalling $program."
        $app.Uninstall()
        Write-Host "$program uninstalled."
    }
    else {
        Write-Host "$program not found."
    }
}
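If typing out every package name feels tedious, WMI filters also accept WQL wildcards, so you can match on part of a name instead. Treat this as a sketch only: a pattern like the one below catches anything with "Revit 2019" in its name, so it won't pick up the associated packages that don't mention Revit, and it may catch things you didn't intend, so review the list from the earlier export before letting it loose on a machine.

# Uninstall every MSI package whose name contains "Revit 2019" - check the exported list first!
$apps = Get-WmiObject -Class Win32_Product -Filter "Name LIKE '%Revit 2019%'"
foreach ($app in $apps) {
    Write-Host "Uninstalling $($app.Name)."
    $app.Uninstall() | Out-Null
}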

Before you run any of these scripts though, you will need to change your execution policy. You can do this just for the current PowerShell session rather than permanently allowing scripts to be run on the system. To do this, it is as simple as

Set-ExecutionPolicy Unrestricted -Scope Process

To run the script you need to include the path to the script, even if you are running it from the current folder. For example:

.\Uninstall_Revit2019.ps1
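Alternatively, if you'd rather not touch the execution policy at all, you can launch the script with a one-off bypass from a standard command prompt or a shortcut, something along these lines:

powershell.exe -ExecutionPolicy Bypass -File .\Uninstall_Revit2019.ps1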

If you are specifically dealing with Revit software, you can take your uninstall a step further and clean out the associated files along with it. To do this, just add the following to the end of your script.

# Remove Revit 2019 user data.
Write-Host "Cleaning Revit 2019 User Data"
Write-Host "Backing up old user profile"
Move-Item -Path "C:\Users\$env:UserName\AppData\Roaming\Autodesk\Revit\Autodesk Revit 2019" -Destination "C:\Users\$env:UserName\AppData\Roaming\Autodesk\Revit\Autodesk Revit 2019_OLD"
Write-Host "Deleting temp files"
Remove-Item -Path $env:temp -Force -Recurse -ErrorAction SilentlyContinue
Write-Host "Deleting user profile temp files"
Remove-Item -Path "C:\Users\$env:UserName\AppData\Local\Temp" -Force -Recurse -ErrorAction SilentlyContinue
Write-Host "Deleting user profile Revit cache files"
Remove-Item -LiteralPath "C:\Users\$env:UserName\AppData\Local\Autodesk\Revit\Autodesk Revit 2019\CollaborationCache" -Force -Recurse
Write-Host "Deleting Revit journal files"
Remove-Item -Path "C:\Users\$env:UserName\AppData\Local\Autodesk\Revit\Autodesk Revit 2019\Journals" -Force -Recurse

Just make sure that if you remove the user data as part of your script, you run the script from that user's profile, not from your administrator profile.
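If you do need to run the cleanup from an administrator account, one option is to loop over every profile folder on the machine rather than relying on $env:UserName. A rough sketch, assuming the default C:\Users profile location and the standard Revit 2019 folder names (note it archives the roaming data but removes the whole local Revit 2019 folder, caches and journals included):

# Clean Revit 2019 user data for every profile on the machine (run as administrator).
foreach ($userDir in Get-ChildItem -Path C:\Users -Directory) {
    $revitRoaming = Join-Path $userDir.FullName 'AppData\Roaming\Autodesk\Revit\Autodesk Revit 2019'
    $revitLocal = Join-Path $userDir.FullName 'AppData\Local\Autodesk\Revit\Autodesk Revit 2019'
    if (Test-Path $revitRoaming) {
        Move-Item -Path $revitRoaming -Destination ($revitRoaming + '_OLD')
    }
    if (Test-Path $revitLocal) {
        Remove-Item -Path $revitLocal -Force -Recurse
    }
}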

Building Healthy Asset Models

Understanding how to integrate with your client's asset and facility management requirements at first seems like a daunting task; however, getting it right can be as simple as asking the right questions.

From the perspective of a project manager or even an asset owner, some of the questions that should be asked are:

  • When will the facilities management team get access to the model? Will it be during the design phases of the project, or will it be once the design is complete?
  • Will the facilities management team be able to dictate to the design team what information is incorporated in the model?
  • Will the facilities team be able to review the model before construction and commissioning?
  • Will the facilities management team own the model when it is completed?
  • Who maintains the model once handover is completed?

Then there are the questions that the more technically minded team members tend to focus on, around information requirements:

  • What information does the facilities team require?
  • Do the facilities team actually need all of this information?
  • What data formats does the facilities team require?
  • Do they have existing non-BIM facilities packages that require integration into the new BIM enabled system? Can it be integrated at all?

If you’re interested in how I approached the problem of recording existing assets in one of Australia’s largest health precincts, take some time out to check out my AU2018 presentation

https://www.autodesk.com/autodesk-university/class/Building-Healthy-Asset-Models-Case-Study-Existing-Asset-Recording-BIM-2018

With a special guest appearance about halfway through from probably Autodesk University’s most famous cat, Burrito. One of the dangers of presenting remotely.

Don’t Have Dimensions in Families for COBie? Don’t Worry!

So you're new to COBie and a deadline is approaching. Your favourite project BIM manager comes up to you a few hours before the deadline and tells you, "We have to do dimensions.. on every element in the COBie drop. You have your dimensions ready, right?"

Well, there is no need to stress; as always, there is potential for Dynamo to come to the rescue. I put this one together in Dynamo v1.3, but I have tested it in v2.x as well and it still works just fine. Just a note though: if you save your old 1.3 graphs in 2.x, they're 2.x graphs forever.

The way that I approach the majority of my COBie work is through a 3D view and a schedule with a set of very handy filters, so I've continued down this route for my Dynamo graph and I start it off by getting all the elements in the active view – my 3D COBie view.

Just in case there is anything in the view that isn't a family, I get the element type of each element, convert that value to a string and then filter the list based on the string "family". This is because every family in the view will be prefixed with Family Type: whereas non-family objects will not.

Once we have the filtered list of families, we need to take the bounding box of each element, which we do with an Element.BoundingBox node. Using Spring Nodes, we next use Springs.Geometry.Extents to separate out each dimension individually.

The next step is a bit of simple math. I want to ensure that the length parameter is always longer than the width, so with a few if statements and some greater than and less than nodes, the top pair of nodes always provides the smaller number that will populate our width parameter and the lower set of nodes provides the larger number which will feed into our length.

Finally, we populate our parameters with the correct information. Note that your parameters must be set correctly as type parameters for this to work; if you have incorrectly made them instance parameters the script will not work, but if you're paying attention you'll see that the hint is in the name of the parameter.

Now, don't forget that with the COBie element data you should be nominating dimensions inclusive of the maintenance requirements for that object. You could always add a little bit of extra fat to your dimensions, but I would highly recommend approaching COBie and BIM in general the right way and including spatial elements that indicate the overall dimensions including maintenance access, similar to what is shown in the electrical switchboard below.

 

For those that want to get started a bit quicker, I've provided my graph for download below.

Merging IFC Files with BIMServer

A question came through via email the other day in the office:

We’ve been trying to issue federated IFC files, but the combined size balloons up to 20x the original size. Is getting these models to a consumable size a wild goose chase?

Well, the short answer is no, it’s not a wild goose chase and it is possible to generate federated IFC files that are almost exactly the same file size as the sum of the smaller files.

The long answer?

Open up this page in a browser:

https://github.com/opensourceBIM/BIMserver/releases/tag/1.4.0-FINAL-2015-11-04

Download 1.4.0-FINAL-2015-11-04.jar and save it somewhere on your local machine. Don't use spaces in the folder names when you save it or the tool won't work properly; use underscores if you need a space.

There is even quite a comprehensive help document with information on configuring the BIMServer tool:

https://github.com/opensourceBIM/BIMserver/wiki/JAR-Starter
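If you prefer launching it from a prompt rather than double-clicking the jar, it's a standard executable jar, so assuming you have a Java runtime installed and on your PATH (and using C:\BIMserver purely as an example of a space-free folder), something like this will bring up the setup window:

java -jar C:\BIMserver\1.4.0-FINAL-2015-11-04.jar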

There is not a lot to do in the setup process though: I just made sure I was set to localhost and that there were no conflicts with my port, then clicked start. After you click start, it will take a while to run through what it needs to do; when it's finished you'll see a line that looks like:

02-10-2018 11:23:33 INFO  org.bimserver.JarBimServer – Server started successfully

Now click the launch web browser button. It will take you to a window that looks like this:

For all the work that you are going to do from here on, you need to use Chrome or similar. Internet Explorer will not work. Also note that if you log yourself out, you will need to log back in with your email address as the username, not your name.

Fill out this page with your details; you're creating a user here and you will be able to log in with these details again in the future. Because this is a web application, we could actually host it internally and have a permanent server setup; it's just that this workflow uses it as a web application hosted on your local machine.

Now that you have created your user and logged in, you will be in the project browser. Select Project -> New Project.

Fill out the details as required to create your project:

This essentially provides a bucket in which you store all of your data. The next step is to create sub-projects to hold the IFC files that you want to merge.

To then upload a model to each sub-project, you need to use the “Checkin” option. It will take a little while to import the files depending on how complex they are.

My files are between 30 and 70 MB each depending on the file. They take about 4–6 minutes to import using a laptop with an i7 6820 CPU.

You’ll know that the files are importing because the IfcGeomServer will be using CPU:

You can view your model in the web browser to verify everything is in the correct location by clicking on the eye (grey = off, coloured = on)

For the merge, you literally just have to click the little arrow to the right of the top level project, and select download.

Depending on how big your models are, the limiting factor might be whether you have enough memory to perform the merge.

If the download doesn’t work, hit the back button or refresh the main page before you try and download. You need to see an eye on the top level project for the download to work.

You can verify your IFC file either by loading it into a new project within the BIM Server web application, or you can load it into Navisworks.

My resulting IFC file is 173 MB, which is exactly the sum of my four individual files.

If you want to try to reduce the size of the IFC file, use the Solibri IFC Optimiser:

https://www.solibri.com/solibri-ifc-optimizer

After running the optimiser, the file size was reduced to 124 MB.

 

 

Placing Multiple Views on Sheets With Dynamo

Keeping on the theme of Dynamo and drawing setup, I had a series of MEP models to set up that had two views that needed to be placed on each sheet: a main view and a smaller inset view.

I started from my sheet generation Dynamo graph and ran through a number of different options to enable the graph to place multiple views on sheets in the correct locations.

The method I found posed the fewest problems in the process was to add an extra column to my Excel file for the names of the inset views.

As a bit of groundwork, I needed to figure out where I wanted my views to be located, so I made up a mock sheet with the views placed where I wanted them to sit on the sheet.

I then threw together a quick graph that allows me to select the viewport I've placed on the sheet with the Select Model Element node; running that through the Rhythm node Viewport.LocationData, I can get the centre point of the viewport box. The centre point of each viewport will be the value used when we move the viewports later.

Once the viewport locations are picked up, we can get to modifying our original sheet creation graph.

The first modification that we need to make is with the additional column in Excel. Copy the original section of the graph and change the code block to 3 so that we are reading from column D of the Excel file.

In the next step we filter our views again, but this time we're filtering two separate lists of views. Things get a little messy but it's still reasonably easy to manage. Note that I dropped a List.Clean node into the mix as I was having views return with no data; the List.Clean node removes empty and null values.

Remember that if you're going to clean the list, you need to feed the other nodes with the cleaned list. Do not mix and match between clean and unclean lists, otherwise you'll have a bad time.

Now we should have two lists of element ids, one for our main views and another for our inset views.

Our main views get fed through the same series of nodes from the moving views on sheets post.

While all of this is happening, the output of the Sheet.ByNameNumberTitleBlockAndViews node shoots off a second time to tell our insets which sheets they need to be placed on.

The problem you will stumble into when placing views with this method is that if the sheet hasn't yet been created, the inset views won't be placed. The way that I decided to handle it was by using a Passthrough node from the Clockwork package. The Passthrough node implements an order of execution: it will wait for the node threaded into the waitFor input to complete before sending on the data threaded into the passThrough input.

I fed the Passthrough node with the results of moving the main viewport on each sheet; once the main views have been moved, the sheet numbers are sent through to the Viewport.Create node.

Running the script, we end up with views placed exactly where we want them. The example GIF below is recorded in real time and runs for 14 seconds from start to finish, which includes checking each sheet has been correctly created.

 

This is great and all, but what happens when you have two different types of "main" views: one that you want placed along with an inset like the above example, and another where you want the views placed centrally?

The way I found best to handle this scenario is to add a String.Contains node along with an if node to control the location of the viewports depending on the name of the view itself.

In this particular example, the names of the "main" views are checked to see if they contain the string Platform Level_. If they do, the views are placed centrally on the sheet; otherwise they're placed offset from centre to allow for the inset view to be placed on the same sheet.

The nodes labelled platform view x, main view x, platform view y and main view y are simply code blocks that I have renamed so I know exactly what they are.

Tagging Invert Levels

Over the years, I’ve seen a lot of err.. solutions for tagging invert levels. From adding shared parameters to your pipe families through to using Dynamo and everything in between.

Ignoring the fact that you can’t add parameters to system families like duct and pipework, there is a far simpler way to get what you’re after.

Ever heard of the spot elevation tool?

The key is in the settings that you use.

The settings that I use in my templates are as follows, the settings to change are highlighted below

In the units format dialogue, change your settings to match the following

And of course, don’t forget to select the bottom elevation!

Practical Dynamo – Moving Views Based on Another View

Okay, so we're on a roll with practical Dynamo usage. Last week we looked at placing views centrally on our sheets, but what if you didn't want the view centrally placed? What if you wanted views placed in the same location on all sheets, maybe in the top left of the page?

As always with Dynamo, there is a solution for that. Again we're going to be using the Rhythm custom node package to get the work done. This method requires one sheet to be used as a template that all the following sheets are based on.

In our example this time around, we want the view located in the top left (shown on the left), but by default Dynamo places our views in the bottom left (shown on the right).

 

This workflow can be easily integrated into our previous graph where we created new sheets in Dynamo using Excel; however, for this example we're going to create a standalone graph. It's assumed this time around that you already have all the required sheets and views created.

 

First we start by taking all of our sheets. We do this simply by using Categories and then All Elements of Category; after that we get into our Rhythm nodes.

We get a list of all of the viewports on our sheets using the Sheet.GetViewportsAndViews node. Run the list through a List.Clean node to remove the empty list entries. This leaves us with just the viewport entries.

Meanwhile, we also need to get the viewport from our template sheet. In this instance our template sheet will be drawing H101, which we'll select using the Sheets node, and then we'll feed that node into the Sheet.GetViewportsAndViews node; both are from the Rhythm package.

And finally we feed our data lists into the Viewport.SetLocationBasedOnOther node, which again is from Rhythm. It’s as simple as that.

Hit the run button and watch the magic happen.

Site Extraction with flux.io and Dynamo

By now, most people in the industry will have heard of flux.io, a spin-off from X (formerly Google X). Recently, flux.io updated their site extraction tool, which pulls data from free open datasets from OpenStreetMap and NASA. When combined with Dynamo, it couldn't be any simpler to pull topography information into your Revit model.

So how do we get started with this new-fangled technology?

Firstly, you'll need a flux.io account. Once you have that sorted, head on over to https://extractor.flux.io/. Once there you'll be greeted with a Google map where you can search for your location. The map system works exactly as you expect it to. Simply drag and resize the selection box around the area you're interested in and then select what you want from the menu at the top right of your screen.

When your data is ready, you can open it in flux and review the results. You simply drag and drop your keys from the column on the left into the space on the right. You can pan, zoom and rotate your way around the 3D preview, although as someone that works in Revit and Navisworks all day long I found that the controls aren't the easiest.

Struggling with the navigation?
right mouse button = pan
left mouse button = orbit
scroll button = zoom

So all of this is great, but how do you get this into Revit? It’s actually incredibly simple.

You will need to have both Dynamo and the flux.io plugin suite installed, but once you do you're only a few minutes away from generating a Revit topography.

To get started you will need to log in to flux.io through Revit and Dynamo. If it's your first time using flux.io you might have to approve the connection between Revit/Dynamo and flux, similar to what you would do when sharing account information with online services such as Google or Facebook.

Find the Flux package within Dynamo and first drop in the Flux Project node.

Once you have your flux project selected, it's just three more nodes. Drop in the Receive from Flux node and select the topographic mesh from the drop-down. From there, push the flux topography into Mesh.VertexPositions and then finally into Topography.ByPoints.

Comparing the flux topography in red against the professional survey in blue, we can see that the flux topography is no replacement for a real survey; we are looking at a 5–8m difference between the survey and the flux data. Thankfully, surveyors aren't going to be out of a job any time soon. This is the case on the example site in Sydney only though; other sites are far more accurate depending on where the source data is coming from. Remember that the flux data comes from a combination of sources, including satellite survey, which leads to varying levels of accuracy. You shouldn't rely on open data like this as your sole source of information; you should be referring to relevant site survey information to verify the data against.

The inaccuracy of the data though doesn’t mean that the flux data is useless. Provided that you’re able to reference the flux data with known survey data and adjust to suit, this provides an excellent opportunity for using the flux data to fill in missing information surrounding your known survey and site. You then have opportunity to use the data for visualisation in concept stages or flyover presentations of large sites or precincts.