Thursday, February 13, 2014

Plane Frame Analysis Front End Version 4

Now converted the variables used in the various plane frame analysis applications from records to classes. This version of the front-end for cpframe.exe can read a data file into the variables, then store the contents of the variables to the worksheet. It can also read the contents of the worksheet into the variables and then write the contents of the variables to a file.

In this way existing data files (.dat) can be opened and edited in Excel. Since reading a file into a workbook overwrites the contents of the cells, it's not a good idea to calculate the geometry or required loading in the worksheet cells; it is better to generate the model using vba, which is the point of the exercise. Using the variables and data structures now defined, I can write vba code to build a model directly into those variables and bypass the worksheet. The worksheet was just used to verify that I can read and generate the file through the use of the variables. The file format of the output doesn't match the file read, as the original data files were generated by QPro. Once I had got QPro to generate data files which pframe could read, I didn't worry too much that they didn't match the exact format of the files saved by pframe. When converted to Excel, the format for numbers was taken from the format assigned to the Excel cells; this is no longer done, the format now being hard coded into the vba code. This was done because the file read macro first clears the cells, and whilst this has been modified to only clear the contents, the macro can read data files larger than the maximum formatted data area (highlighted in yellow in the worksheets), so some format would need assigning to these extra records. The hard coded formats also closely match the original Pascal formats: not liking some of the original formats, I have changed them without affecting the ability of the original application to read the data (thus far).
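As a sketch of what hard coding the format into the vba code looks like (the record layout, field widths and names here are illustrative assumptions, not the actual cpframe file format):

```vb
' Illustrative sketch only: the record layout and widths are assumptions,
' not the actual cpframe data file format.
Sub WriteNodeRecord(ByVal f As Integer, ByVal id As Long, _
                    ByVal x As Double, ByVal y As Double)
    ' Format$ fixes the number format in the code, independent of any
    ' cell formatting, so records beyond the formatted (yellow) data
    ' area of the worksheet are still written consistently.
    Print #f, Format$(id, "00000"); " "; _
              Format$(x, "0.000E+00"); " "; _
              Format$(y, "0.000E+00")
End Sub
```

The point of the Format$ strings is that they can be made to match the original Pascal write formats, whatever formatting happens to be on the worksheet cells.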

From this point forward I will need to move away from buttons in the worksheet. As far as I know, vba variables have a life only as long as the execution of the macro. Thus when the macro stops, the variables cease to be accessible, though the memory may not be cleared. This is the reason for storing the contents of the variables to the worksheet, and then reading back from the worksheet. Though I did read recently that global variables do retain values between vba calls: maybe. Before including the subroutines for reading the data into the variables, I did trial reading then writing without storing and retrieving from the worksheet first: the two buttons seemed to work. However I put that down to valid data still being in memory, and the same chunk of memory being addressed when the macro was next run; in other circumstances the data may not be valid and a different chunk of available memory may be addressed. This is based on past experience with Excel 97, though at that time I admit I was not overly familiar with use of the Public keyword, so past failures to initialise variables via the workbook open method may have been because of not having global variables: though in that case it seems it would have complained about unknown variables and the initialisation wouldn't have worked. Not something I really want to test or rely on: so I will assume that from one vba call to the next all variables are wiped. Besides, since making the test for this application, I have removed the global variables and made them local to the main class.
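For what it's worth, the "maybe" above can be illustrated: a Public variable in a standard module does keep its value between macro runs, but only until the project is reset (an unhandled error, an End statement, or editing the code), which is exactly why it can't be relied on and why the worksheet round-trip is the insurance. A minimal sketch:

```vb
' In a standard module. A Public variable keeps its value between
' macro calls, but is wiped by an unhandled error, an End statement,
' or editing the code - hence storing to the worksheet as insurance.
Public NodeCount As Long

Sub MacroA()
    NodeCount = 42
End Sub

Sub MacroB()
    ' Shows 42 if the project has not been reset since MacroA ran,
    ' otherwise 0 (the default value of a Long).
    MsgBox NodeCount
End Sub
```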

So on the assumption that variables are wiped from one vba call to the next, it is necessary to keep the vba code active, and to do that I need to move the buttons from the worksheet to a vba form: and have the form retain control until all tasks are complete. The alternative is to repeat large segments of initialisation code on each vba call. This latter approach was adopted for the wind loading functions contained in schTechLIB, where each function calls subroutines which load arrays which are then searched for data: may have to revisit that and see if I can initialise once.
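The "form retains control" approach can be as simple as the following sketch (UserForm1 and the initialisation routine are placeholder names, not the actual front-end code):

```vb
' In a standard module: load the data once, then hand control to a
' modal form whose buttons call the individual tasks. The variables
' stay alive for as long as the macro, and hence the form, is active.
Sub RunFrontEnd()
    InitialiseData          ' placeholder: read the file into the variables once
    UserForm1.Show vbModal  ' the form's buttons do the work; vba never stops
End Sub
```

With Show vbModal the Sub does not return until the form is closed, so all the module-level and class variables remain valid for every button click on the form.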

Another problem to be addressed is access to the variables external to the class in which they are defined. The variables are arrays, and vba does not allow public arrays in the definition of a class. So whilst converting records into classes in Turbo Pascal and Delphi is relatively easy, it's cumbersome in vba. An alternative is to use collections in vba; these can be public, however they are highly likely to be incompatible with the analysis routines of plane frame. {Noting that the idea is to incorporate plane frame in the code, not shell out and pass a data file to a console application. The console application is just a development tool, and is not affected by any code I write for the front-end creating the model to be passed to the frame analysis. The first task is to auto-generate a valid structural model of a manufactured structural product (MSP): not analyse it, that task comes later.}
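The collection alternative looks something like this sketch (the class and member names are assumptions for illustration):

```vb
' Class module clsModel (name assumed). A Collection can be declared
' Public in a class module where an array cannot, at the cost of a
' data structure the Pascal analysis routines know nothing about.
Public Nodes As Collection

Private Sub Class_Initialize()
    Set Nodes = New Collection
End Sub

Private Sub Class_Terminate()
    Set Nodes = Nothing
End Sub
```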

If I don't use collections then I need to write property (Let, Get, Set) methods to access the individual elements of the arrays. At present I have only written one of the classes, in Pascal and vba, and use of collections keeps the vba most similar to the Pascal.
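Such accessor methods might look like the following sketch (the class name, member name and array bounds are assumptions, not the actual code):

```vb
' Class module clsNodes (name assumed): a private array exposed through
' indexed Property Get/Let procedures, since vba forbids Public arrays
' in a class module.
Private m_X(1 To 100) As Double

Public Property Get X(ByVal i As Long) As Double
    X = m_X(i)
End Property

Public Property Let X(ByVal i As Long, ByVal value As Double)
    m_X(i) = value
End Property
```

From the caller's side the property reads like an array: `n.X(5) = 3.2` and `h = n.X(5)`, which keeps the calling code close to the Pascal.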

Which is the better option I will work out as I attempt to gain access to the variables and define a structural model.

Structural Design
At the moment I am assuming that structural design is split into three stages:

Stage 1
Consists of creating the structural model:

  1. Dimension and Geometry
  2. Design Actions

Stage 2
Consists of analysing the structure and determining the reactions and action-effects. The analysis method used depends on the structure being modelled, and the tools used for the analysis depend on the complexity of the analysis.

Stage 3
Consists of:

  1. Component sizing
  2. Connection Design
  3. Footing Design

However the whole process is not all that simple, as connections and footings themselves have their own components and may require repeating stage 1 to stage 3, using different analysis methods suited to the details of the component considered. That is, steps 2 and 3 of stage 3 can be removed and these components considered as requiring their own structural models. {e.g. connections modelled in isolation along with applied forces, using the finite element method}

So cpframe only covers stage 2, for structures which can be modelled as plane frames. The front-end for cpframe covers stage 1, and the back-end covers stage 3. There is plenty of software available for stage 2 and stage 3: though not much choice for software covering Australian materials codes. It is stage 1 however where the problem lies with respect to rapid design of manufactured structural products (MSP). For simple MSP's a few simple calculations in a spreadsheet will suffice for all stages; for more complex structures some analysis engine is required for stage 2. If using an analysis engine then a structural model needs to be built from simple user input. For structural frameworks the form of the structural model is relatively similar; for example, I only need to add a few extra fields and add WriteArc methods to all the classes to export the model to a MicroStran .arc file. These methods I already have in classes specifically written for writing MicroStran files. {Pity that I think MicroStran even needs to import the arc file. Otherwise could launch that instead of cpframe.}

So given that the data structures required to represent a structural model are relatively similar, I can create my own data structures and use them to create models for export to whatever structural software I choose. Note I am not concerned with picking up the dimension and geometry from a CAD model and adding loads to it: the dimension and geometry are to be auto-generated from a few simple parameters, with no other variations permitted. If other variations are wanted, then export a model for import into general purpose structural analysis software. An MSP has a defined structural form and a limited set of parameters: if the structural form changes then it is no longer the off-the-shelf MSP. I don't see the purpose of the software as being to permit design at point-of-sale (PoS), but to constrain the options available: so that the customer knows when they have just defeated the entire point and purpose of going to a supplier of MSP's, which is that some significant period of design and engineering is not required. On the other hand I want to make more customisable options available at the PoS. So that eventually we get auto-generation of basic options, and then manual editing of more custom options: with the allowable editing constrained. {NB: Some advanced level CAD systems, for more than 20 years now, have been able to define dimensions by mathematical expressions. For example the span can be set to twice the height. Plus various nodes or points can be used as constraints and references. However that doesn't necessarily equate to being able to increase the quantity of components, such as varying the number of portal frames with the length of the building. Also in the building industry they don't pay too much attention to assemblies and sub-assemblies: so the building, rather than being a series of portal frames, becomes a forest of columns with a collection of rafters. Thus 3D modelling can change the perception of the structure: getting away from the sub-assemblies which make it up.}

In any case the focus at the moment is auto-generation of the structural model, and ultimately being able to analyse that model using various tools to get comparative checks on structural adequacy, so that independent technical checking is viable. In the past and even now, independent checking is a problem: the manufacturers use proprietary software to generate and check compliance in a few minutes, whilst the city council's engineer certifying the structure may require a week to check the design using general purpose tools: or otherwise simply reading through the reports produced.

So my view is to provide the tools to the certifiers and general designers first, for use with general purpose structural analysis tools. This in turn increases the potential for MSP's in the first place, and with MSP's comes the manufacturer's desire to limit what the sales team can sell, to keep the production economical. So go from the flexible to more and more constraints.

Download frontEndPFrame04.xls .

Users of the software must accept this disclaimer of warranty :
The software is supplied as is. The author disclaims all warranties, expressed or implied, including without limitation, the warranties of merchantability and of fitness for any purpose. The author assumes no liability for damages, direct or consequential, which may result from the use of the software.

[13/02/2014] Original

[23/04/2016] Changed download links to MiScion Pty Ltd Web Store

Tuesday, February 04, 2014

Future Development of our Plane Frame Program

It is not the intention to write a stand-alone frame analysis program. Software such as MicroStran and Multiframe serves that purpose reasonably well.

Similarly it is not the intention to write software supporting a building information model (BIM). Software such as ArchiCAD, Revit Structural, RAM Build, Tekla (XSteel/XEngineer) and Master Series Master Portal serves that purpose.

None of this software however is suitable for manufacturers of structural products who do not employ engineers on staff. Employing an engineer on staff is not overly viable: for one there is a shortage, and second there is the problem of a graduate gaining appropriate experience and guidance to do the required job in such places. Also civil engineers are not overly suitable in any case: they don't have the repetitive manufacturing and product development knowledge. They typically design buildings one at a time, for a specific purpose, to comply with one set of regulations. Local regulations are irrelevant when designing a product; the product may have to meet regulations in multiple regions with minimum to zero variation in its form. On the other hand mechanical/manufacturing engineers lack the building structures knowledge. More importantly, an engineer employed on the staff of these manufacturers is likely to get bogged down dealing with the day to day hassles of cleaning up the mess created by the sales people.

The sales people and their customers need to be enabled and empowered to make the right decisions, to keep their company out of trouble and avoid hassles for the customer. I've trialled tables: they always want smaller increments in the tables and usually want to opt for the unconservative direction when making a choice. I've tried graphs: there are unusual and incorrect ways to read simple graphs. Also a single graph or table doesn't specify the requirements for a complex assembly; multiple such design aids have to be used, thus complicating the process. Hence software is seen as the solution by the manufacturers, and they typically seek an Excel spreadsheet. An Excel spreadsheet typically because they know of something similar used elsewhere, or because they know we and others use Excel for our calculations.

Spreadsheets with calculations directly in the worksheet are not however a sensible option. It can involve a considerable amount of work just to add one additional structural element. Using arrays or records, such an additional structural element can be added with relatively minor effort. Therefore using a programming language like vba/Pascal and a database like MS Access or SQLite is a more efficient use of computing resources. Efficient use of computing resources is potentially becoming important again as the use of smart phones increases, along with Internet services.

From such a perspective, Java and JavaScript may be considered the more preferable programming languages, as these are the more readily available programming languages for the Internet and Android phones.

To convert either the vba or Pascal source to VB.Net, I have to convert all arrays to zero based indices. Since Java is largely C-like in its syntax, I assume it also requires zero based arrays. Therefore I could adopt Java in preference to VB.Net; however Java is likely to impose more constraints on the use of classes. Further, as with Lazarus, Java has the problem that it isn't immediately compatible with the .net framework: and writing either a COM automation server or .net component is going to be more complex than with Visual Studio and VB.Net. Since I would like to produce something which can be used in either Excel, MS Access or referenced in some other application, COM or .net is important.

At present however it is easier for me to convert the Turbo Pascal source to VB.Net, using class/objects and zero based arrays. I will do this in parallel with further development in Lazarus and vba, and at the same time modify the program so that it is based on constructs which make it easier to translate to Java and C#. Where C# provides compatibility with the .net framework and Java with Android: with the two languages being about as similar as VB.Net and vba.

The approach may be slow, but it will permit the modules of the intended larger application to remain functional in at least one programming environment at any point in time.

pframe needs a back-end for member design for steel, cold-formed steel, timber and aluminium members. Currently this is only available in vba. Connection design is only available in Excel worksheet calculations.

Developing the back-end however I consider to be of secondary importance, as such a facility is already available, to a limited extent, in software like MicroStran and Multiframe.

The feature missing from most of the commercial software for the Australian market is auto-generation of the structural model. Sure, the software can generate the dimension and geometry for a few common structural forms: but it doesn't auto-generate the structural model with the loading. Nor is the software able to iterate through various structural models subject to some constraint: such as increasing the height until some maximum moment is reached. {Though Multiframe is a COM automation server and can be programmed for such a task}
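The kind of constrained iteration meant here can be sketched in a few lines of vba. Everything in this sketch is an assumption for illustration: MaxMomentFor stands in for a call to whatever analysis engine is used, and the starting height, step size and dummy moment relationship are not from any real product.

```vb
' Hedged sketch of the missing iteration feature: increase the frame
' height until some maximum moment limit is reached.
Function MaxMomentFor(ByVal span As Double, ByVal h As Double) As Double
    ' Placeholder: in reality, rebuild the structural model at height h
    ' and analyse it. The relationship below is a dummy, for illustration.
    MaxMomentFor = 0.1 * span * h
End Function

Function HeightAtMomentLimit(ByVal span As Double, _
                             ByVal limitM As Double) As Double
    Dim h As Double
    h = 2.4                              ' starting height (m), assumed
    Do While MaxMomentFor(span, h + 0.1) <= limitM
        h = h + 0.1                      ' step size (m), assumed
    Loop
    HeightAtMomentLimit = h              ' largest height within the limit
End Function
```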

With respect to auto-generation of a structural model in the form of data files for pframe and MicroStran, that is already written in Delphi and vba. Converting it to VB.Net is expected to be relatively easy. However having auto-generation as an integral part of pframe would be a lot better, rather than interacting only via data files. Since converting pframe to VB.Net is the obstacle, merging the Turbo Pascal and Delphi source seems the fastest approach: other than that the wind loading is obsolete. The wind loading however really needs modifying so that it gets data from files rather than hardcoded arrays: for which purpose XML files may be suitable.

History of the Development of our Plane Frame Analysis Application

The earliest dated Pascal source files I can find on our systems are dated 1990; however I didn't start using the program until the 1996 version. The 1996 version uses Turbo Gadgets to provide walking menus, and it otherwise uses the full graphics screen to display moment diagrams etc. This I wrote about in a previous post on my blog: since it's a DOS based program using the full graphics screen, it's not overly compatible with Windows: it works on Windows XP but not Windows 7.

Back in 1996 the program was the only frame analysis program Roy Harrison & Associates (now MiScion Pty Ltd) used. A few years later we purchased a MicroStran license, and a few years after that, rather than get a 2nd MicroStran license, we got a Multiframe license as a comparative check. Once we had the commercial software, this simple plane frame program fell out of use. {The application at this time had names alternating from f_wrk.exe to frame.exe}

Over the years however we have been repeatedly asked if we can supply software to help our clients' sales personnel customise their structural products at point-of-sale (PoS). None however were really serious about investing in the development of such software: just a hopeful expectation that since we did everything by computer, and we wrote the tools for such, we could just throw something together. Not that simple.

The program as released here (cpframe) is the console (c) version of plane frame (pframe); I have removed the user interface. I never really used the user interface of the original program: this is because I wrote and used Borland Quattro Pro (QPro) macros to generate the data files. Our original projects involved standard calculations for sheds and carports, subject to a set of conditions but with a requirement to find the maximum height possible. Since wind loading is dependent on height, I needed to calculate new wind loads for each height until I found the limits of the structural section being used.

I could have written additional Pascal code and merged it with pframe to achieve such an objective; however I already had QPro spreadsheets for general wind loading calculations, and it was therefore faster to write QPro macros to generate the data files for pframe.

Similarly I could have written more Pascal code to integrate cold-formed steel design (AS1538/AS4600) into pframe, but once again I already had QPro spreadsheets for general design of members and connections. In terms of solving clients' current problems I had no time to develop a full application in Pascal.

Now whilst QPro could launch pframe, pframe didn't support command line parameters. Therefore I had to manually open the data file and run the analysis, then import the results into QPro. Since pframe only supports a single loadcase, I had to repeat this procedure for each loadcase. Pframe was thus a bottleneck in the whole process that I wanted to remove. {This bottleneck was slightly improved once we got MicroStran and I modified the QPro macros to generate MicroStran .arc files. MicroStran could deal with all loadcases and envelope the extreme values, thus reducing the total number of steps compared to using pframe.}

Back in 1996 however I had an aversion to touching the Pascal source code for pframe, written by my father, and adding the desired command line parameters or otherwise expanding the program. The aversion stemmed from not having any simple and rapid means of testing the program to ensure I hadn't messed up the calculations.

So I needed an alternative method. Whilst moment distribution is easy enough to learn for continuous beams, it's an otherwise cumbersome approach for plane frames (though extremely practical when there isn't anything else). American texts tend to show moment distribution calculations littered about a diagram of the frame, whilst others present a more tabular approach. Either approach seems messy, and prone to error. Since I had a steel designers' manual with Kleinlogel formula for the frames I was most interested in, I adopted the Kleinlogel formula.

Still, using Kleinlogel formula is time consuming and prone to error, so I set up a spreadsheet to do the calculations. Spreadsheets being slightly cumbersome however, I decided to also program the calculations in Delphi 3 (Pascal).

QPro macros for AS1170.2 wind loading
Parallel to this was the desire to expand the wind loading calculations and make them more complete: not limited to the few scenarios most often encountered, with other scenarios handled manually. Since I was more familiar working with arrays in high level languages like Fortran, Pascal and C, it seemed easier to develop the wind loading functions in Delphi (Pascal) than in the QPro macro language. So the wind loading calculations were developed in parallel in QPro and Delphi, one used as a check against the other.
Now the problem with QPro macros is that the calculations are not up to date unless the macros have been run, and these were either run when the dialogue boxes collecting data were closed or when a hot key was used. This made the spreadsheets slightly cumbersome, but I had read somewhere that there was potential to create a dynamic link library (DLL) and add functions to QPro, and this seemed possible using Delphi. Though I hadn't read in detail how to do so, and it seemed complex anyway, the potential was there. Hence the parallel developments in Delphi and QPro were not considered wasteful, as they were expected to merge at some point in the future.

However, whilst wandering around book stores during my lunch break while working on contract in the city, I bumped into a book on programming Office 97 using Visual Basic for Applications (vba). I had read some articles in computer magazines about Excel/vba but wasn't sure how vba related to Excel, plus I had a bias towards Borland software. Still, I bought the book.

I had been hoping that Borland would merge the IDE's for Turbo Pascal, Turbo C and Turbo Basic and ensure they had the same function libraries; had built-in language converters, including converters to turn GWBASIC into simple QPro spreadsheets or Paradox applications; and furthermore would make these the programming languages for Paradox and QPro. They kind of did this: they threw Paradox and QPro away and developed Delphi and C++ Builder. Corel QPro I didn't like: it was buggy. However, it was our 2nd QPro for Windows license, and time was being wasted modifying my Borland spreadsheets to work in the Corel version. I didn't want to solve that problem by getting an additional Corel license. I was looking for an alternative spreadsheet, and after reading the book on vba and Office 97, I went and bought Office 97 Professional and the systems developer kit (SDK).

The wind loading functions I had programmed in Delphi, I translated into vba, thus making them available as functions in the Excel worksheet. A simple change to a spreadsheet cell and all the wind calculations were up to date, and the frame analysis using Kleinlogel also completed. Manually changing parameters in the spreadsheet, I could quickly map out the potential of all the available c-sections for use in cold-formed steel sheds.

But I still had a problem. Translating the dialogue boxes from QPro to Excel 97 wasn't so easy. Connecting the dialogue boxes to cells in Excel seemed cumbersome compared to QPro; I may have been doing it wrong, but I had no real references for such. I tried the QPro way and that didn't seem to work: a similar approach does work in Excel 2003. Though there is still an issue of being able to abandon the dialogue box and not automatically update the worksheet: such was not a problem with QPro. Besides it appearing cumbersome to allocate data to drop down lists on the dialogue boxes, there was another problem with Excel, and that was timing. There seemed to be a timing problem between getting data from dialogue boxes, evaluating user defined functions (UDF) and updating worksheet calculations: either crashing or simply not calculating the correct answers.

Initially I had tried to replicate the QPro-like wind loading macros, making use of the worksheets to store the data tables, but that appeared to be part of the timing problem, and therefore I decided to abandon that approach in favour of using arrays in vba. Due to the problems with dialogue boxes, I abandoned them in favour of setting up input forms fully in the worksheet. Once the scope and lifetimes of vba variables were better understood, the workbooks worked fine.

But due to the problems encountered with programming vba, development continued in parallel in Delphi. I did attempt to iteratively change the structure height in an Excel worksheet, capture the resultant maximum frame moments and tabulate them. But there was a clear timing problem in that situation: the height can be changed faster than the worksheet can update the calculations. Incorporating delays could probably have fixed it, but why incorporate delays when the objective is to get calculated information as quickly as possible. Hence the Delphi program was expanded to iterate through the heights, or through the spans, or through both heights and spans, and calculate a table of maximum moments.

I then decided to produce a height/span chart, and charting in Excel seemed easier than using Delphi graphics. Using Excel I could produce and store tables and charts and any other reporting I may want; furthermore I could control Excel from Delphi. Unfortunately the Excel type library didn't import properly into Delphi due to keyword conflicts. The consequence was that programming Excel from Delphi was being done blind. A bit of a problem, as Delphi uses [] for arrays and () for function parameters, whilst vba uses () for everything. So that part of the application also needed parallel development in Excel and Delphi: test what I needed to do in Excel, then translate to Delphi.

This however was interrupted by changes to the wind loading code (AS1170.2), and since developing the application was a sideline to day to day structural design, it was more important to update the general purpose wind loading Excel workbooks than to update the wind loading in Delphi. As a consequence Delphi was abandoned for developing the height/span charts, and it was all written in Excel/vba, all calculations in vba, thus avoiding problems of timing with worksheet calculation updates.

Since we had been going down the path of developing a stand-alone application in Delphi, pframe was part converted to Delphi, to create a Windows application with the graphics for the moment diagrams added but without the rest of the interface developed.

However due to all the member design (AS4600/AS4100/AS1720) being written in Excel/vba only, and the wind loading update issue, it was considered that a move over to development in Visual Basic might be more productive. So we obtained Visual Studio (VS) 2003 and VB.Net, and then to get a second license we ended up with VS 2005.

But spreadsheets are still easier to format reports in than messing around programming Delphi or VB.Net. Sure, those who prefer MathCAD type presentations think otherwise: that Excel is poor for presentation. For some reason there was some resistance to pushing forward with Delphi or VB.Net development, because of resistance to plain ASCII text files (.txt) or rich text files (.rtf), and the complications of print preview and getting things to printers. But none of that is really a problem with MS Windows, as notepad (.txt) and wordpad (.rtf) are available on each machine. Sure there is a possibility that the user can modify the results: but they would have difficulty proving and replicating such a contended error in the program. Further, today results are typically sent to pdf files rather than paper printout: and the pdf files can be edited.

Which is another point: we had started to trial generating pdf files and producing all-electronic documents early in the 1990's, but it was cumbersome to use pdf files where we needed results for further calculations. It was far easier to use paper printouts and mark up the required data. Scanning handwritten calculations to incorporate with computer generated calculations also produced massive pdf files, and so electronic documents were abandoned; we didn't have large enough hard disks to store the stuff, and zip disks were expensive. Hence further reason to integrate all the calculations electronically: eliminate the paper printouts for reference and produce smaller pdf files.

VB.Net turned out to be significantly different from vba, and therefore it was put aside and pframe was converted to Excel/vba (xlFrame), so that it could interact more directly with the Excel worksheet for input data and reporting.

Not long after doing that we were approached to provide a structures module for a carport/verandah spreadsheet. Whilst the structures module is relatively small, the spreadsheet itself is relatively large; much of it is data and could probably be better done using MS Access: which is another development track pursued, along with Paradox, with respect to materials management.

Now the Delphi application, besides using Kleinlogel formula to generate height/span moment tables, also generates data files for pframe and MicroStran (.arc) as well as AutoCAD LT scripts (.scr); it can also read data files from various other programs written. Much of this has been converted over to Excel/vba and extended further, but as separate Excel workbooks. Attempting to gather a lot more vba code together into a single integrated application hit some limit of Excel/vba. Whilst the code seems to run ok, it's not possible to work on the modules: on attempting to close/save the file it hits a memory limit and basically crashes. It won't save the file except through its recovery feature, with all the vba code removed.

Since I don't consider that I should use a better computer with more memory, nor that I should reduce the number of vba modules, further development in vba has stalled. Leading me to revisit VB.Net.

The expectation was that I could simply change xlFrame into a COM automation object or .net component or similar, which can be plugged into Excel. Then all the parallel developments would disappear, as all my wind loading and member design function libraries and the plane frame analysis could all be in VB.Net, and possibly in a single library. Unfortunately that prior problem of the differences between VB.Net and vba makes such conversion difficult for the plane frame analysis: though it is a simple conversion for the function libraries.

Also I want software like the carport/verandah software to export data files compatible with pframe and also to generate MicroStran arc files. When tested, the data files exported by xlFrame were not compatible with the 1996 version of pframe. This led me back to looking for Turbo Pascal source code to compile the original 1996 version of the program, and trace why the new data files were not compatible. Finding source code which compiled and used the same data files as the operational exe file, and which also produced correct results, wasn't so easy.

The change in the file format was attributed to a change in how partial loads were defined. The error in the calculations was tracked down to dynamically allocated variables being freed from memory before the results stored in those variables were actually used.

So having gone back to Turbo Pascal, and given that I prefer Object Pascal to vba, especially with respect to arrays in classes, it does seem that further development in Pascal may be the better option, with Lazarus being a viable alternative to Delphi. Though a COM automation server may not be so easy to develop in Lazarus as it is in Visual Studio.

In any case, at the moment I am maintaining parallel developments in Turbo Pascal, Delphi 3, Lazarus, (VS 2005) and vba as I convert the record data structures into classes/objects. The main difference at the moment is that an array has been converted into a collection in vba, though I may convert that back into an array and write property functions to access it. Both these vba approaches seem cumbersome compared to the other languages.
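The array-behind-property-functions pattern can be sketched as follows. This is a minimal Python illustration only (the actual code is in vba and Object Pascal), and the `Node` and `FrameModel` names are invented for the example, not the real cpframe data structures.

```python
# Sketch of wrapping an internal array behind property-style accessors,
# analogous to converting a Pascal record containing arrays into a class.
# Node and FrameModel are illustrative names, not the actual cpframe types.

class Node:
    def __init__(self, number, x, y):
        self.number = number
        self.x = x
        self.y = y

class FrameModel:
    def __init__(self):
        self._nodes = []          # internal array, not exposed directly

    def add_node(self, number, x, y):
        self._nodes.append(Node(number, x, y))

    def node(self, i):
        """Indexed accessor, like a vba Property Get over an array."""
        return self._nodes[i]

    @property
    def node_count(self):
        return len(self._nodes)

model = FrameModel()
model.add_node(1, 0.0, 0.0)
model.add_node(2, 0.0, 3.0)
print(model.node_count)       # 2
print(model.node(1).y)        # 3.0
```

Keeping the array private and exposing it only through an indexed accessor means the internal storage can later be switched between an array and a collection without touching any calling code.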

Plane Frame Analysis: Alternative Front End

Created an alternative Front-End for Plane Frame analysis. Instead of a single worksheet with all the data required by cpframe the data has been split between multiple worksheets. This makes it easier to add extra data records for each data set. This also makes it simpler to read an existing data file into the workbook, which may be useful if an auto-generated data file doesn't appear to be producing the correct results or if cpframe cannot read the file.

As my primary interest is auto-generation of the models using vba and other programming languages, rather than building them in the worksheet, the next front-end I release will read the data into appropriate vba data structures, with facilities to save the data to the worksheet or retrieve it from the worksheet. Similarly I will write a back-end based on data structures similar to those used by cpframe to write the results in the first place.
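The read-then-write round trip can be sketched as below. This is a hedged Python illustration: the three-column "node x y" layout is an assumed stand-in, not the real cpframe file format, and the hard-coded field widths simply illustrate fixing number formats in code rather than taking them from worksheet cell formats.

```python
# Round-trip sketch: read a plain-text data file into structures, then
# write it back out. The "node x y" layout is an assumed placeholder for
# the real cpframe format, which is not reproduced here.
import io

def read_nodes(f):
    nodes = []
    for line in f:
        parts = line.split()
        if len(parts) == 3:
            nodes.append((int(parts[0]), float(parts[1]), float(parts[2])))
    return nodes

def write_nodes(f, nodes):
    for n, x, y in nodes:
        # formats hard coded in the code, not taken from cell formatting
        f.write(f"{n:5d} {x:10.3f} {y:10.3f}\n")

src = io.StringIO("1 0.0 0.0\n2 0.0 3.0\n")
nodes = read_nodes(src)
out = io.StringIO()
write_nodes(out, nodes)
print(nodes[1])   # (2, 0.0, 3.0)
```

Once the data lives in structures like `nodes`, storing to or retrieving from a worksheet becomes an optional extra step rather than the primary data store.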

As I convert cpframe to a Windows console application, I will also add an option to read and write directly to MS Excel. {NB: Currently cpframe is an MS-DOS application and only supports 8.3 file name conventions. In converting to MS Windows the intention is that it stays a command line console application.}

The file for the alternative front-end is:


Users of the software must accept this disclaimer of warranty :
The software is supplied as is. The author disclaims all warranties, expressed or implied, including without limitation, the warranties of merchantability and of fitness for any purpose. The author assumes no liability for damages, direct or consequential, which may result from the use of the software.

[04/02/2014] Original

[23/04/2016] Changed download links to MiScion Pty Ltd Web Store

Plane Frame Analysis : The Back End

As mentioned in discussing the Front-End, the development of the back-end for frame analysis is of secondary importance, as it is relatively well catered for by off-the-shelf structural analysis software. For example, software like MicroStran and Multiframe has two options concerning the sizing of structural members, and these are:

  1. Check
  2. Design

The "Check" option carries out an assessment as to whether the currently set structural sections are adequate for the action-effects generated by the applied loads, and otherwise gives some indication of the efficiency of such sections. The designer can then adjust some sections in an attempt to get more efficient sections, and run the analysis again and check adequacy. The designer can repeat this until they are happy with the result.

The "Design" option automatically finds the most efficient section for each member, the designer can then opt to use these sections or not. Typically adopting the results of using the "design" option is highly impractical. Consequently the results of the "design" option are just used as a guideline for manually changing some of the members but not all of them. Therefore some human interaction is required to reach the final design selection.

Additionally, little of the available software has integrated options for connection design and footing design; this is typically all done external to the program. Also, as previously mentioned, the 3D geometric model is not necessarily a valid structural model, and therefore there are other components designed external to the analysis software. Some 3D structural analysis software, explicitly optimised for buildings, allows components to be modelled graphically but excludes them from the 3D structural analysis and treats the components separately in the manner most appropriate for such components. This allows everything to be modelled visually but without creating an invalid structural model.

For manufactured structural products (MSP's) we typically expect:

  1. Reduced parameter set
  2. Reduced set of components

For example, cold-formed steel sheds and carports made from c-sections and employing rigid portal frames typically have such frames at 3m centres. The frame spacing is thus one parameter which is rarely changed, and when it is, it is usually reduced. This is because the C75 typically used for girts and purlins is barely capable of spanning 3m; however, it wastes little floor space compared to larger sections fastened to the face of columns. The roof pitch is also typically locked. These things often need to be varied but are not within the scope of the standard calculations typically held by suppliers, hence the desire for software to allow such variation.

With respect to a true MSP, there is no real need for structural analysis software. Often the issue of analysis versus lookup tables arises, with lookup tables being considered inefficient. It is incorrect to conclude that lookup tables or databases are inherently inefficient for the task. In fact there is a good chance that structural analysis is the wasteful, inefficient option.

If we have something like the span tables of the timber framing code (AS1684), or the girt and purlin load capacity tables for c/z-sections, then certainly automating the lookup of values from such tables is likely to be inefficient if it is based on simply building a database containing the printed tables. Similarly it would be inefficient to place the span tables for steel carports and verandahs into a database. If we look at cold-formed steel sheds, then the manufacturers typically have an ad hoc, random collection of standard calculations with no rationale behind them: the calculations are of little use to anyone, other than for annoying the customers with a failure to have anything compatible with their needs.

Rather than a database of values, the real need is for a database of constraints which can be readily attached to the input boxes on data collection forms. The MSP's are meant to be pre-engineered, thus all the engineering is expected to have been done already. The engineering can therefore be used to define constraints and associated solutions, and the software can therefore run a lot faster. In other words, instead of searching through all the available structural sections to find one that works, from the very start we already know what the minimum suitable section is. We know the capabilities of the smallest available section, and also know the limitations of the largest available section. So it is not necessary to carry out structural analysis at the PoS to identify that a proposed building is beyond the scope of the typical cold-formed steel structure and requires significant custom engineering to make it feasible in cold-formed steel. {eg. We recently designed a 32m span x 8m high building using Fielders' largest Dimond sections. The section is not adequate in its own right and therefore had to be reinforced with steel flat bar and folded channel. Therefore feasible, but not something we would get the salesperson to do whilst chatting with the customer at PoS. This is not a Fielders shed, it is just using their materials.}
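A constraint database of this kind can be very small, as the sketch below suggests. The envelope figures are invented placeholders for illustration, not real section capacities; the point is that the point-of-sale lookup replaces any analysis run.

```python
# Sketch of a constraint database: instead of running an analysis at the
# point of sale, look up the pre-engineered envelope for each section.
# The envelope figures are invented placeholders, not real capacities.
ENVELOPES = {
    # section: (max span m, max height m), listed smallest first
    "C150": (6.0, 3.0),
    "C200": (9.0, 3.6),
    "C250": (12.0, 4.2),
}

def smallest_adequate_section(span, height):
    for section, (max_span, max_height) in ENVELOPES.items():
        if span <= max_span and height <= max_height:
            return section
    return None   # outside the product range: refer to engineering

print(smallest_adequate_section(8.0, 3.0))    # C200
print(smallest_adequate_section(32.0, 8.0))   # None
```

A `None` result is itself useful at PoS: it flags immediately that the enquiry is beyond the pre-engineered product and needs custom engineering, without any structural analysis being run.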

The database doesn't need to be massive. Further, if talking about large databases, then the structural drawings and structural model, especially if in 3D, represent an extremely large database. Whilst the analysis of a 3D structural model is typically very fast, the automatic sizing of the members by the software can be painfully slow. The earlier versions of MultiFrame, for example, were extremely slow compared to MicroStran when it came to running the "design" option: now they are about the same, with MultiFrame having got faster. This I expect had more to do with MultiFrame's complex multi-window user interface than with the algorithm operating behind the scenes. So opting for analysis does not reduce the size of the database, nor does opting for lookup tables increase the size of the database. The structural product needs to be looked at carefully, and if it hasn't really been designed, that's going to be difficult.

For example, each available cold-formed c-section has a maximum achievable envelope for which it is suitable when used for a simple gable frame. Once that section has been selected for a proposed building, the connection components and footing system are also largely determined. Therefore we only really need to know what the defining envelope is for each c-section. A simple data input form can then automatically update, based on constraints, in response to simple inputs. Depending on the structural product, this could all be done by an extremely simple and small Excel spreadsheet.

However, all the engineering for the product needs doing first, before any such constraints are available, and the building industry is not really into being proactive and designing a product to satisfy the needs of a market; it is instead highly reactive, only responding when it bumps into and trips over the customers' needs. On the other hand, if they did decide to be proactive and went to a consulting civil/structural engineer to get an MSP designed, they would bump into a series of problems: that's why the manufacturers typically hold a random collection of structural calculations obtained on an as-needs basis. An infinite number of points along a line segment of any length leads to an infinite number of standard designs being required, which is not practical; therefore they seek software so that parameters can be varied on an as-needs basis. Most manufacturers, however, are too small to pay for the development of such software, and also seemingly too small to pay for product development.

My view, however, is that they could pay for product development if they employed engineering associates on staff and made use of off-the-shelf software. They could develop the product in small steps and otherwise provide a higher quality of service to their customers, by having engineering capability on staff rather than hoping some external consultant is available at the time required.

If we focus on product development, and on having a product available which meets the needs of the customer, then the PoS software can be kept simple and all the design and engineering done in the back room prior to customer enquiry. The real objective is to predict accurately what the customer wants and have it available already, not to ask them what they want and supply it at some future date.

Therefore the back-end of frame analysis is of secondary importance, as there is now a diverse range of structural analysis software available which can be used for sizing members. Where little effort has been put in is auto-generating the structural model, with geometry and loading. This is because the focus for high-end software is dealing with geometry which comes from an architect and having to transform this into a structural model.

For MSP's we are only concerned with the structure, and are therefore more able to generate geometry and loading. The importance of this is that at the point of generation we know that a certain structural element is in the roof or the wall, and therefore know what loads to apply to it automatically.

For architecturally defined geometry, we do not know that a beam is in the roof unless it carries additional data which can be interrogated, so that the correct load can be applied to it. CAD operators find putting lines on the appropriate layers cumbersome, and commands designed to ensure that entities have the appropriate layer and other attributes even more cumbersome. So the probability that all elements in a building information model (BIM) are tagged correctly, allowing automation tools to work correctly, is relatively low.

For an MSP, however, everything is supposed to be predefined, and therefore we have far greater potential to auto-generate the structural model. If we can do that, then there is plenty of software available for what I have labelled the back-end of frame analysis. Developing a back-end is therefore not something I wish to give priority to, as all this other software provides the needed independent check on the design of the structural product. I have MicroStran and Multiframe licenses explicitly for the purpose of checking one against the other. Most of the time I only use one package, but when strange things occur I build models in both packages, check one against the other, and hunt down the causes of any variation.

With an auto-generated structural model and a large variety of software available to carry out the frame analysis and size the members, there is reduced potential to question the validity of a manufacturer's MSP, as there is potential for a large number of independent checks. The structural model is not hidden in some obscure software owned by a manufacturer. Further, the suppliers of the general purpose frame analysis software will be under increasing pressure to further develop the back-end capabilities of their software, as their software will be the ultimate bottleneck in the whole process. So why expend effort re-inventing the wheel? These software developers already have 80% or more of what is required for the back-end of frame analysis; let them add the missing features.

The current major bottleneck is building the model for use in the available software, when it comes to the common structural forms of the MSP's. However, some manufacturers may be better served by a stand-alone structural analysis package with an integrated back-end highly customised to a specific MSP.

Simple Back-End
Therefore, to provide for experimentation with the back-end of frame analysis, I have thrown together a simple MS Excel template. The template just has a single button which reads the results file generated by cpframe and writes the results into the cells of a single worksheet. Once the results are inside a worksheet, they can be linked to other cells which are used for the sizing of members, the checking of connections and the sizing of footings. To deal with multiple load cases, however, it would be better to read the results into an array and process all the load cases using vba. It is generally preferable to avoid performing calculations in a worksheet unless a relatively tidy presentation can be developed: as the number of structural members and load cases increases, such calculations become increasingly impractical.
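Reading the results into per-load-case structures rather than cells can be sketched as follows. The "CASE n" / "member moment" layout here is an assumed stand-in; the actual cpframe results format is not reproduced.

```python
# Sketch of reading an analysis results file into per-load-case
# structures instead of worksheet cells. The file layout shown is an
# assumed placeholder, not the actual cpframe results format.
import io

def read_results(f):
    cases = {}
    current = None
    for line in f:
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "CASE":
            current = int(parts[1])
            cases[current] = {}
        elif current is not None:
            member, moment = int(parts[0]), float(parts[1])
            cases[current][member] = moment
    return cases

text = "CASE 1\n1 12.5\n2 -8.3\nCASE 2\n1 6.0\n2 -4.1\n"
results = read_results(io.StringIO(text))
# worst moment on member 1 across all load cases
print(max(abs(c[1]) for c in results.values()))   # 12.5
```

With the results in a structure keyed by load case, finding the governing case for each member is a one-line loop rather than a forest of worksheet formulae.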

I see both the front-end and back-end being developed entirely in vba or another programming language. Whilst it is possible to do the calculations in the worksheet, it just becomes increasingly prone to error and a nightmare to manage. Why repeat a calculation in 10 cells by copying one cell, when we can write the formula once, place it in a loop, and be sure all 10 calculated results are based on the same formula at all times? Copying cells is prone to unexpected changes in cell references. Such changes may be easy to spot after the fact, but are not always fresh in the user's mind when copying the cells.

Worksheet calculations are useful for checking the operation of vba code, and otherwise for testing and breaking vba functions by attempting to supply invalid data. For example, when testing a function for the case of division by zero: has it been covered? What other inputs can break the function? All of this is easier to test by grabbing input parameters from the worksheet. Whether the function actually works when called from vba is another matter, as the features available to handle errors in a worksheet cell are not valid when executing solely within vba.
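The division-by-zero case is the classic example: a worksheet cell can display #DIV/0! and carry on, but a function called purely from code has to handle the error itself. A minimal sketch (the `utilisation` name is illustrative):

```python
# Sketch of hardening a function against bad input. A worksheet cell can
# show #DIV/0! and continue; a function called solely from code must
# detect and report the error itself. `utilisation` is an invented name.
def utilisation(demand, capacity):
    if capacity == 0:
        raise ValueError("capacity must be non-zero")
    return demand / capacity

try:
    utilisation(10.0, 0.0)
except ValueError as e:
    print("caught:", e)

print(utilisation(10.0, 20.0))   # 0.5
```

Raising an explicit error at the point of failure is what makes the function safe to call from other code, where there is no cell for an error marker to appear in.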

In any case, the back-end template is:


Users of the software must accept this disclaimer of warranty :
The software is supplied as is. The author disclaims all warranties, expressed or implied, including without limitation, the warranties of merchantability and of fitness for any purpose. The author assumes no liability for damages, direct or consequential, which may result from the use of the software.

[04/02/2014] Original
[23/04/2016] Changed download links to MiScion Pty Ltd Web Store

Monday, February 03, 2014

Plane Frame Analysis: The Front End

The plane frame analysis command line application previously released is typically expected to be used for developing and testing a larger application, which would comprise the following:

  1. Front-End which auto-generates the structural model (the data file for cpframe).
  2. cpframe
  3. Back-End which obtains the results from cpframe and then uses to size members, check connections and size footings
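The three stages chain together naturally from code, as sketched below. The command line shown for the solver is an assumption for illustration, not cpframe's documented interface; the demonstration substitutes a stand-in "solver" so the sketch is self-contained.

```python
# Sketch of chaining front-end -> solver -> back-end. The solver
# invocation is assumed, not cpframe's documented interface.
import os
import subprocess
import sys
import tempfile

def run_pipeline(solver_cmd, data_path, results_path):
    # subprocess.run blocks until the solver exits, so the results file
    # is known to exist before the back-end tries to read it
    subprocess.run(solver_cmd + [data_path], check=True)
    with open(results_path) as f:
        return f.read()

# Demonstration with a stand-in "solver" that just copies its input
# file to the expected results file name.
with tempfile.TemporaryDirectory() as d:
    data = os.path.join(d, "model.dat")
    results = os.path.join(d, "model.out")
    with open(data, "w") as f:
        f.write("dummy model\n")          # front-end would write this
    fake_solver = [sys.executable, "-c",
                   "import sys, shutil; "
                   "shutil.copy(sys.argv[1], sys.argv[1][:-4] + '.out')"]
    output = run_pipeline(fake_solver, data, results)

print(output)   # dummy model
```

Because the call blocks, the timing problem of reading results before the solver has finished simply does not arise when driving the solver from code rather than from separate buttons.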
My primary focus is on the front-end, as there is plenty of structural analysis software out there which already integrates the structural analysis function of cpframe with the back-end functions of member sizing, though there is little software available for checking connections and sizing footings within the structural analysis software. The available structural analysis software also typically has facilities for auto-generation of the dimensions and geometry of common structural forms. Very little of the software, however, will auto-generate the model with loads applied, and that which is available doesn't cover Australian standards.

With respect to manufactured structural products (MSP's) there are at least 3 issues for structural design software, and these are:
  1. Rapidly analyse and specify structural requirements and generate a cost at the point-of-sale (PoS), to be used by sales people and customers.
  2. Provide adequate documentation for seeking development approval (building permits)
  3. Aid independent checking and/or testing
The latter issue is important because, without adequate consideration, it becomes necessary to generate an unnecessary amount of documentation when seeking development approval for each and every project. MSP's are made in large numbers: if the structure was adequate last week then, barring any major changes in codes of practice, it will be adequate this week and next week. To churn out documentation in the form of a thick report for each and every project becomes silly. The report needs to be kept, as far as possible, to a single-page certificate: but we need independent means of checking the validity of such a certificate.

Since most software for rapid design of MSP's is restricted to use by the manufacturers and their sales agents, no consulting engineer charged by local councils to do an independent check can rapidly generate a design model for use in general purpose structural software. Checks would involve the following:

  1. Relevance of the software or structural model to the proposed building structure
  2. Validity of the input parameters to the model
  3. Validity of the calculated results
Similar checks are also required by consulting engineers employed to check and certify the software. However, whilst such software may be required to be certified, such certification doesn't mean the software is relevant to the projects it actually gets used for in its day-to-day usage. The authors of such software also need to be conducting such checks and tests.

To my mind it has never been a simple matter to just dive straight in and start writing software for a manufacturer. The first stage should be developing tools which assist with independent testing through the use of general purpose structural analysis software. To start with, the engineers typically asked to provide such software usually start out by providing the engineering, so they need tools to provide rapid feedback to their clients. {NB: This is not the case of designing a building for an architect and coming back in a week; the feedback needs to be in around 30 minutes. In 24 hours the customer could have gone elsewhere, got a price for a building and placed an order.}

Unfortunately the manufacturers typically want to start putting software to use immediately, and once they do they will start hitting projects which are beyond the scope of the software. The engineer will then have to deal with these variations in rapid time frames. This situation reinforces the need to have an interface to general purpose structural analysis tools, so that custom variations to the MSP can be easily handled externally to the manufacturer's software.

Now, given that MicroStran is popular general purpose structural analysis software, and its text-based '.arc' files are a common import option in other software, auto-generation of MicroStran '.arc' files would be the more productive option to adopt for the design office.

However developing a front-end to auto-generate structural models for MicroStran is not useful with respect to developing the software required by the manufacturer of MSP's, as MicroStran is not available as an analysis engine for integration into other software.

Therefore we need to pursue parallel paths:
  1. Auto-generating models for a structures engine
  2. Auto-generate a model compatible with general purpose structures software.
Since we have MicroStran and MultiFrame licenses, I will be developing auto-generation of models for these two packages, along with models for our structures engine (cpframe). In the first instance I will focus on models for use with cpframe, as engineering graduates have the greatest potential for writing model generators, and they don't necessarily have access to the commercial software packages.

It is also to be noted that most consulting engineers have little interest in the auto-generation of structural models, as they mostly work with architects and have to fit a structure to a building, whereas with MSP's the building is fitted to the structure. So most consulting engineers will be increasingly moving towards the use of building information models (BIM) as such becomes more affordable and practical. However, BIM forces the use of 3D models, and 3D structural models introduce a multitude of problems with respect to realistic modelling of the real structure. There are components in building structures which cannot be modelled correctly in general purpose 3D structural analysis software. If these components are left out, say girts and purlins for example, then we no longer have a single 3D structure but a series of isolated plane frames. These 2D plane frames are largely the same; therefore we wouldn't waste time modelling all of them, just the most heavily loaded frame, and make them all the same. This is important for PoS software, as the computer has to wade through all available components and find the ones which work, whereas an experienced designer would start off with a reasonably good guess and would only need a few iterations to find the best solution from the available components. In short, at present BIM is too expensive, the 3D graphical model has little relationship to engineering models across all disciplines, and a lot of extra work is imposed for no benefit.

In any case, for these engineers auto-generation of the structural model is of little value, as the primary requirement would be to auto-generate design actions applied to the dimensional and geometric model created by the architect. For such a situation, ideally we want the structural model, with the design actions applied, created as the architect designs the building. For example, the architect inserts a floor, defines its purpose, and the floor structure is automatically generated. The structural engineer's task would then be to tweak the model and advise the architect of changes to make. Ultimately the architect should be able to remove a column and the beam it supported would turn red, identifying it as no longer suitable and in need of re-sizing. That is, 80% of the expert advice the architect needs would come from the software, the remaining 20% from specialist consultants. It is to be noted that architects often have difficulty finding engineers who are capable of realising the proposed building, and so the buildings get reduced to the pinned and braced boxes which are within the capabilities of the available engineers. {NB: Whilst an engineer may have studied some theory, or be able to pick up and read literature on new or rarely used theory, it doesn't mean they are confident or willing to apply such theory in the real world. So in the first instance computer software allows those who can to enable those who lack the confidence to go it alone. Such software, however, provides a foundation for enabling and empowering everyone: instead of building physical prototypes and testing, they use the software as the test bed.}

So with MSP's we are dealing with standardised structures, and the proposed building has to fit the available MSP. The structure therefore takes precedence over the building design; the designer has to fit their building into the envelope provided by the MSP. The designer is typically the owner, who has decided that an MSP is suitable for their purposes, and who otherwise wants to avoid the delays caused by architects and engineers. Unfortunately they typically miss the delays which will be caused by expecting a salesperson to provide design for the custom features the buyer desires, with an engineer being called in at the last minute when everything has been rejected by the local city council. Since the engineer can be any available engineer, such an engineer first has to get up to speed with the manufacturer's product. This will turn into a hassle, as the manufacturers don't really have product specifications, or they spew forth a lot of nonsense about intellectual property (IP). If they had any real IP then they would have patents, and if they have patents, it's public knowledge. In the main they have no real product, no product manager, no product development manager, and an over-willingness to sell whatever the customer asks for. Hence their desire for software to do the design to the customer's requirements at the point of sale: but do the design and engineering for what? That is something they have no idea about, beyond the unrealistic expectation: anything the customer asks for.

Clearly, developing an auto-generator with the flexibility to generate a structural model for anything under the sun is a major undertaking, if not impossible. To achieve that would require some form of scripting or modelling language to define a new structural form without going back to source code programming. There would still need to be development of a new interface for the user to incorporate the new MSP; it wouldn't be something done at PoS whilst talking to the customer. {NB: MicroStran has a macro language for auto-generation of complex geometry: as I remember, spirals and such. I never really used it; it's limited, and it's a variation of the .arc file, it just removes the need to define all the nodes and their connectivity.}

So whilst there may be flexibility behind the scenes, in the foreground there is a necessary restraint on what the salesperson and customer can do. Otherwise we could simply use BIM-type software at PoS and feed it into the back office for engineering at some time after PoS. The whole point of MSP's is that such structures are as close as possible to being off-the-shelf buildings, with a fixed price, comparable from one supplier to another. Too much variation and it's no longer an MSP. A car is pre-engineered: you cannot go into the showroom and ask for it to be supplied with 3 wheels instead of the standard 4; such variation imposes a need for extensive engineering, which includes building prototypes and physically testing that the mathematical models are valid and not overly unrealistic. Choosing whether or not to have a radio in a car doesn't typically impose a need for additional engineering. Putting a window in a shed doesn't impose a need for engineering; placing a door in a shed which is wider than the spacing of the frames does. Placing a building in a wind loading environment it wasn't originally designed for imposes a need for additional engineering. Most of this structural engineering is little more than calculations to codes of practice. Not all of the engineering, however, is within the scope of calculation, and some physical testing is often required. Hence the importance of defining the product before writing the software.

Software is also a product, and it should also be designed before being written. However, it is recognised that the manufacturers just want to get on with manufacturing. You don't get the full technical drawings and specifications for a car, but then again the customer is not permitted to make changes to the car which would require such information. The building industry does permit making changes which require the additional information, and therefore they need to make it available: which is extremely difficult if they haven't produced it.

There is a lot of up-front work to do before software the likes of MicroStran or Multiframe can be put to use. Most MSP's are well within the capabilities of Australia's 2-year qualified engineering associates to design and assess, and such persons are likely more compatible with the needs of the majority of the manufacturers than professional engineers. So if the structural models for the MSP's can be auto-generated and made compatible with commercial structural analysis software, then the engineering associates can handle custom variations in house, a lot more rapidly than queuing up for assistance from a consulting engineer. If such personnel can be employed, along with appropriate software, then the need for PoS software would be slightly reduced, because going to the manufacturer would still be faster than going to an architect and then going out to tender.

So I would say we are looking at:
  1. PoS design solutions in no more than 5 minutes during at most a 30 minute interview.
  2. Behind the scenes solutions in less than 24 hours: same day response.
The latter can be achieved by semi-automating the readily available general purpose structural analysis software. The former requires full automation at PoS, and requires highly specialised custom software.

In the first instance, therefore, I would recommend semi-automating the readily available software and employing the right people on staff. Such tools would enable all consultants, and increase the number available, who can deal with custom variations to MSP's, including future extensions, who can test and certify the more specialised software, and who can otherwise independently check and certify individual building projects. Diving straight into the specialised software owned by the manufacturer and only available to their sales distributors just creates a lot of hassle for everyone else involved.

Internal to an application which auto-generates a structural model, it doesn't really matter what structural analysis software is used. That is, the difference between generating a model for cpframe or MicroStran is simply a matter of file format: the data required for the files is the same. Therefore a model generator written for cpframe, if written appropriately, can easily be adapted to a multitude of general purpose frame analysis software by writing an appropriate file export procedure, the principal task of generating a model having already been completed in the main body of the application.
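Separating model generation from file export can be sketched as follows. The output lines are deliberate placeholders; neither the cpframe nor the .arc syntax is reproduced here, and the gable-frame node layout is a simplified illustration.

```python
# Sketch of separating model generation from file export: the generator
# builds one neutral model; each target format gets its own writer.
# The output lines are placeholders, not real cpframe or .arc syntax.
def build_gable_frame(span, height):
    # model generation: the part worth writing once
    nodes = [(1, 0.0, 0.0), (2, 0.0, height),
             (3, span / 2, height + span / 8),
             (4, span, height), (5, span, 0.0)]
    members = [(1, 1, 2), (2, 2, 3), (3, 3, 4), (4, 4, 5)]
    return nodes, members

def export(model, write_node, write_member):
    nodes, members = model
    lines = [write_node(n) for n in nodes] + [write_member(m) for m in members]
    return "\n".join(lines)

model = build_gable_frame(6.0, 3.0)
# placeholder writers standing in for format-specific export procedures
as_format_a = export(model,
                     lambda n: f"N {n[0]} {n[1]} {n[2]}",
                     lambda m: f"M {m[0]} {m[1]} {m[2]}")
print(as_format_a.splitlines()[0])   # N 1 0.0 0.0
```

Supporting another analysis package then means supplying one more pair of writer functions, while `build_gable_frame` and everything upstream of it is untouched.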

An MS Excel Template for cpframe.
The simplest, though not very practical, means of generating a model for cpframe is to use MS Excel and perform all the calculations in the worksheet, connecting these to a summary sheet which has the appropriate format for writing to a cpframe data file. To such an end I have thrown together a template for this purpose, called frontEndPFrame01.xls.

As it's a template file it doesn't actually do anything. The template merely provides a worksheet which can be adapted to suit various structures by adding extra data rows as required. Buttons are provided to write the contents of the worksheet to a data file compatible with cpframe, and then run a shell to execute cpframe to generate the results file. As the application is procedural, one can be reasonably certain that the data file will exist before the next command is executed, therefore these two buttons can be combined into a single command button: with the data file being automatically generated and passed straight to cpframe. On the other hand, one cannot be certain the results file is available before the next command executes, as cpframe may still be running. Therefore a separate, manually executed task is needed for viewing the results. This command merely opens the results file using Windows Notepad.
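A minimal sketch of the two commands in vba, assuming cpframe.exe accepts the data file name on the command line; the file names and the WriteDataFile macro are placeholders for illustration:

```vba
' Sketch only. Shell returns as soon as the child process starts, so the
' data file is safe (written before Shell is called) but the results file
' may not yet exist: hence the separate, manually run viewer button.
Sub RunAnalysis()
    WriteDataFile "gable01.dat"    ' assumed macro: worksheet contents -> data file
    Shell "cpframe.exe gable01.dat", vbNormalFocus
End Sub

Sub ViewResults()
    Shell "notepad.exe gable01.out", vbNormalFocus
End Sub
```

The asynchronous behaviour of vba's Shell function is exactly why the two buttons for writing and running can be merged, while viewing the results cannot be.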

As a test, my gable frame spreadsheet can be merged into the template. This spreadsheet calculates the wind loads on a simple gable frame, the same structural form as set up in the front end template, so I just need to link the appropriate cells, such as the wind loads on the frame, into the template data sheet. Since the spreadsheet already contains the Kleinlogel formulae for the frame, the results of running cpframe can be checked. Since cpframe can only handle a single load case at a time, conditional formulae would be required to switch the load case the file is being generated for. This is the approach I took in 1996 using Quattro Pro (QPro) and the original version of pframe. {Actually back then the program was called either f_wrk.exe or frame.exe; pframe was the name of my QPro workbook which drove the program.}

QPro pframe.wb1
As can be seen from the screen capture of the QPro workbook, it contained buttons for collecting data defining the structure, writing the data file for the plane frame analysis program, running the plane frame analysis program, and then printing out the multitude of worksheets used to design all the various components of the whole building extending beyond the primary frame. To the right of the worksheet is a small table showing the load cases for the old permissible stress design, with a marker showing the current load case for which the plane frame data file is to be created.

Plane Frame Analysis Launched in front of QPro Workbook
With the original version of the plane frame application, I had to manually open each data file and run the analysis inside the plane frame application. With cpframe those steps are removed. Still, with the original application, that is how I manually incremented the heights of the structures until I broke the section desired to be used, and then produced standard calculations for the maximum height structure possible. {Stepping back, of course, to what did work. I also happened to know which load case was most likely to be the critical load case, therefore I only needed to increment the height for one load case, and check the other load cases when it appeared the maximum height had been reached.}

The QPro application never got fully converted to MS Excel for a variety of reasons: basically all the building blocks are there, just not connected up. In any case, using the worksheet to do the calculations is not the most efficient way to do things. It's intuitive and fast initially, but cumbersome at a later date when needing to add additional members and load cases. It is better to use the wind load functions in schTechLIB directly in vba code using arrays, rather than reference the functions to carry out calculations in worksheet cells. Calculations in the worksheet cells are fine for presenting the calculations, but a hindrance to more complex tasks.
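Something along these lines, noting that the function names, signatures and variables shown here are placeholders for illustration, not the actual schTechLIB interface:

```vba
' Sketch only: fill an array of member loads directly in vba, rather than
' pulling the same library functions through worksheet cells.
Sub BuildWindLoads()
    Dim qz As Double, i As Long
    Dim memberLoad(1 To 4) As Double
    qz = DesignWindPressure(45#)                    ' placeholder schTechLIB call
    For i = 1 To 4
        memberLoad(i) = qz * Cpe(i) * frameSpacing  ' Cpe(), frameSpacing assumed
    Next i
    ' memberLoad() can now be written straight into the load case variables,
    ' bypassing the worksheet entirely
End Sub
```

Adding another member or load case then becomes a change to an array bound or a loop, rather than inserting rows and re-linking cell references.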

Download frontEndPFrame01.xls

For a more flexible approach I will release an alternative template, with the data spread across multiple worksheets, making it easier to shrink and expand the structural models, and making it easy to read existing data files into the Excel workbook. Generating multiple models in the one workbook is not compatible with worksheet calculations: multiple models are a definite move towards using vba or another programming language.

Users of the software must accept this disclaimer of warranty:
The software is supplied as is. The author disclaims all warranties, expressed or implied, including without limitation, the warranties of merchantability and of fitness for any purpose. The author assumes no liability for damages, direct or consequential, which may result from the use of the software.

[03/02/2014] Original

[23/04/2016] Changed download links to MiScion Pty Ltd Web Store