Showing posts with label structural design. Show all posts

Tuesday, May 26, 2015

Bundle of my ExcelCalcs UpLoads

{The link to download the bundle is at the bottom of the post.}

Whilst my preference is that my spreadsheets are downloaded via ExcelCalcs and that queries are placed in the ExcelCalcs forum, it is apparent that people want the spreadsheets without needing to join ExcelCalcs. Most of my spreadsheets are dependent on links to other workbooks, some .xls and others .xla; in consequence the download limits on ExcelCalcs may prevent new users from obtaining a fully working set of my workbooks. None of my spreadsheets are dependent on XLC. Whilst I believe it is good software, I have moved beyond the need to format my calculations in standard text book format. For my comparison of MathCAD/SMath type applications versus spreadsheets read:
Electronic Calculations (eCalc's).
My primary concern is calculating results and making decisions, not documenting the journey taken. As a consequence I make extensive use of Visual Basic for Applications (vba), with MS Excel primarily being used to provide a file format, an editor, and reporting capability.

The spreadsheets are primarily concerned with the structural design of manufactured structural products (MSP). Such products mainly comprise steel, cold-formed steel, and timber sheds and canopies. The spreadsheets are modifications of the production spreadsheets we have used for design for many years at MiScion Pty Ltd (also trading as Roy Harrison and Associates).

The spreadsheets we use in-house are for more complete buildings, involving member and connection design (e.g. schShedDesignerR01.xls is a cut-down version). The idea of releasing the spreadsheets was to provide the building blocks for others to build custom workbooks for other more specific building forms. If people want custom workbooks or vb.net/vba applications for their structural product then I can be contacted at MiScion Pty Ltd.

Structural design of a product can be divided between the following three major activities:
  1. Brief Description: Design Brief
  2. Evidence-of-Suitability
  3. Detail Description: Specification
Provision of structural calculations primarily falls into the evidence-of-suitability activity, whilst drawing falls into both the design brief and specification activities.

Structural design can also be considered divided into the following:
  1. Product Structure/Description
  2. Dimension & Geometry
  3. Design Actions
  4. Design Action-Effects
  5. System/Component Stability/Resistance
    • Design/Assessment of Structural Form
    • Design/Assessment of Members
    • Design/Assessment of Connections
    • Design/Assessment of Interface/Supports (Footings)
The spreadsheets listed below are roughly divided into the above categories. For further information, links to ExcelCalcs and blog posts are provided. At present most of the blog posts simply display the ExcelCalcs page, but in the future I will add more detail about the workbooks. Also note that the graphics on the ExcelCalcs pages were put there by the site administrator, not myself, and don't always reflect the nature of the spreadsheet; and since editing those pages is limited, the blog posts here will be updated and modified first.

© Copyright 2015 Steven Conrad Harrison
The bundled package comprises the following files:
FILENAME DESCRIPTION Blog ExcelCalcs
gpl.txt

readme.txt

Chart9BTC3 y04m06d14.pdf ColdFormed Steel Sheds Australia Height Span Limits of C-Sections. blog ExcelCalcs


TECHNICAL LIBRARY
schTechLIB.xla Library of functions. schTechLIB contents blog ExcelCalcs
schTechLIBV2.xla Library without DAO references

ENVIRONMENT
Beaufort.xls Beaufort Wind Scale blog ExcelCalcs
as4055.xls AS4055 Simplified wind loading for products blog ExcelCalcs
as4055v1.xls AS4055v2 Simplified wind loading for products blog ExcelCalcs
schWindAssessment_r02.xls Wind Loading to AS1170.2 blog ExcelCalcs



DIMENSION & GEOMETRY


drawWorkSheet2009.xls Experiments with Parametric Sketches using XY Charts. blog ExcelCalcs
schAcadLTCivilScriptWriter.xls Civil engineering Long Profiles and Sections. blog ExcelCalcs
schBuildingDimensions.xls Dimension and geometry of a gable frame shed: frame member lengths and bracing lengths. blog ExcelCalcs
schDrawSection.xls Draw Sections. blog ExcelCalcs
schCADDv2.xls CADD. blog ExcelCalcs


drawShed.zip CAD: Automatic generation of framing plans and elevations simple gable frame. blog ExcelCalcs
sample.dwg

schDrawShed.xls



vbaDXF.zip VBA Experiments Parsing ACAD DXF files. blog ExcelCalcs
DXFtoolsV01.xls

vbaDXF1.xls

vbaDXF2.xls

vbaDXF3.xls



drawShedDC1.zip CAD: Experiments with DesignCAD: Draw 3D framing of American Barn type structure. blog ExcelCalcs
Column1.dcd

schDrawShedDC1.xls



ExcelShapes.zip VBA Experiments with Excel Shapes Layer: Structural Framing Plans. blog ExcelCalcs
struMtrl.mdb

shapesTut01B.xls



drawTut.zip VBA Experiments with ACAD Script Automation. blog ExcelCalcs
drawTut01.xls

drawTut02.xls

drawTut03.xls

drawTut04.xls

drawTut05.xls

SampleSCR1.xls

UnSymmetricalGableSCR.xls

vbaDraw01punch.xls

vbaDraw02.xls

vbaDraw03.xls

vbaDraw04.xls



schHolePunching.xls Estimating: Hole punching requirements for roll-formed sections.




PRODUCT STRUCTURE TREE
bomStructureTreeStage3.xls exploded BOM (Bill of Materials). blog ExcelCalcs
schBOMStructureTreeStage1.xls Indented Bill of Material. blog ExcelCalcs


explodedBOM.zip IE/POM/CAPM Automatic Explosion of Bill of Materials. blog ExcelCalcs
Assemblies.xls

Materials.xls

mrpBOMv2.xls



ASSEMBLY ANALYSIS/DESIGN
schGableCanopyTimber.xls Gable Canopy to Australian Codes. blog ExcelCalcs
schKleinlogel03.xls Kleinlogel. blog ExcelCalcs
schShedDesignerR01.xls Wind Loads on Gable Frame to Australian Wind Code AS1170.2. blog ExcelCalcs


schDesignEngineR01.zip Application for Generation of Height Span Charts Gable Frame Sheds. blog ExcelCalcs
AcadScript.xls

BeamCalc.xls

Building00.xls

DBGtrace.xls

DataCosmos.xlt

DesignEngine.xls

GUI_lib.xls

Geom3D.xls

HeightSpanTableForm.xlt

Klein3.xlt

Primer.xls

RigidFrame.xls

Structure.xls

XStrings.xls

Xmaths.xls

as1170.xls

as4600.xls

struMtrl.mdb





MATERIALS
schStruMtrl.xls Structural Materials Data Steel. blog ExcelCalcs
schTimberMatrl.xls Timber Data for AS1720. blog ExcelCalcs
struMtrl.mdb MS Access database of properties; origin of schStruMtrl.xls.

MEMBER DESIGN
schDsgn1720.xls Calculator assessment of timber structures to AS1720. blog ExcelCalcs
schColdformedCee.xls Example Using Circular References to Force Iteration: Calculation Effective Section Modulus for Coldformed C-Section to AS4600. blog ExcelCalcs
schDsgn4600.xls
schDsgn4600R2013.xls
Calculator for assessment of cold-formed steel structures to AS4600.

Further information on setup can be found here.
blog ExcelCalcs
schDsgn4100.xls Calculator for assessment of steel structures to AS4100. blog ExcelCalcs


CONNECTIONS DESIGN
schTechNote022pt2.xls Tables for strength of bolted joints in thin cold-formed steel sheets to AS4600. blog ExcelCalcs


PRODUCTION AND OPERATION MANAGEMENT
schPlannerCalendar.xls Planner Calendar. blog ExcelCalcs
schWorkStudy.xls IE: Work study flow process chart. blog ExcelCalcs


GEOGRAPHICAL INFORMATION SYSTEMS
centralPlaces4.zip Experiments with Geographical Information System (GIS) central places. blog ExcelCalcs
CentralPlaces4ShedSuppliers.xls



MISCELLANEOUS
vbaObjects.zip VBA Experiments with Class Objects. blog ExcelCalcs
objTut01.xls

objTut02.xls

objTut03.xls



dataStruct.zip VBA Experiments with Abstract Data Structures. blog ExcelCalcs
dataStruct00.xls

dataStruct01.xls

dataStruct02.xls

dataStruct03.xls

dataStruct04.xls

dataStruct05.xls

orgDataStru.xls

treeExperiments.xls



vbaTuts.zip Excel/VBA Tutorials. blog ExcelCalcs
Node.dwg

NodeA.dwg

MyTest.txt

MyTest2.txt

TestNodes2.txt

vbaTut33.TXT

vbaTut00index.xls

vbaTut01.xls

vbaTut02.xls

vbaTut03.xls

vbaTut04.xls

vbaTut05.xls

vbaTut06.xls

vbaTut07.xls

vbaTut08.xls

vbaTut09.xls

vbaTut10.xls

vbaTut11.xls

vbaTut12.xls

vbaTut13.xls

vbaTut14.xls

vbaTut15.xls

vbaTut16.xls

vbaTut17.xls

vbaTut18.xls

vbaTut19.xls

vbaTut20.xls

vbaTut21.xls

vbaTut22.xls

vbaTut23.xls

vbaTut24.xls

vbaTut25.xls

vbaTut26.xls

vbaTut27.xls

vbaTut28.xls

vbaTut29.xls

vbaTut30.xls

vbaTut31.xls

vbaTut32.xls

vbaTut33.xls

vbaTut34.xls

vbaTut35.xls

vbaTut36.xls

vbaTut37.xls

vbaTut38.xls

vbaTut39.xls

vbaTut40.xls

vbaTut41.xls

vbaTut42.xls

vbaTut43.xls

vbaTut44.xls

vbaTut45.xls

vbaTut46.xls

vbaTut47.xls


The zip package can be downloaded free of charge from MiScion Pty Ltd: spreadsheet Bundle. MS Excel should automatically update the workbook links to the current folder. Create a subfolder of "My Documents" called eCalcs, and below this create a folder called materials. The materials data files should be placed in this folder. The materials files are:
  • struMtrl.mdb
  • schStruMtrl.xls
  • schTimberMatrl.xls

Revisions:


  1. [26/5/2015] : Original Bundle Release
  2. [11/6/2015] : Updated the zip file to include revised versions of workbooks which had previously been uploaded to ExcelCalcs. These mainly comprise changes to the AS4600 and AS4100 workbooks, which now have a button to open the section library, and also worksheet application parameters to enable the DAO functions to find the MS Access database of section properties (this is currently only required for AS4600). For more information refer to: My spreadsheets DAO and 64 bit Windows 7. For those not using AS4600 there is also an alternate version of schTechLIB which does not have the references to the Microsoft DAO 3.6 object library; this is named schTechLIBV2.
  3. [01/02/2016] : Changed source of zip file from Dropbox to MiScion Pty Ltd (the family business)

Monday, April 07, 2014

Wind Loading Surface Roughness Length versus Terrain Category

A simple spreadsheet making use of AS1170.2:1989 Appendix E. This appendix provides formulae for converting surface roughness length (z0) into a terrain category and, for region A, for calculating the terrain adjusted wind speed or otherwise the terrain category multiplier (Mz,cat).

The AS1170.2 commentary contains a design chart which identifies various terrains and gives the surface roughness length (z0). From this chart it is apparent that most rural properties are not TC3, but neither are they TC2. By the use of z0 an intermediate terrain category can be calculated, and more economical designs are possible.
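The interpolation the spreadsheet performs can be sketched as follows, in Python rather than worksheet formulas. The roughness lengths of 0.002, 0.02, 0.2 and 2.0 m for terrain categories 1 to 4 are the commonly tabulated values and are an assumption here, so the category varies linearly with log10(z0); verify against Appendix E before relying on the numbers.

```python
import math

def terrain_category(z0):
    """Intermediate terrain category from surface roughness length z0 (m).

    Assumes roughness lengths of 0.002, 0.02, 0.2 and 2.0 m for terrain
    categories 1-4, so that the category varies linearly with log10(z0).
    Check against AS1170.2 Appendix E before using the numbers.
    """
    tc = 1.0 + math.log10(z0 / 0.002)
    return min(max(tc, 1.0), 4.0)   # clamp to the defined range TC1..TC4

# Most rural properties fall between the tabulated categories:
print(terrain_category(0.02))   # exactly TC 2.0
print(terrain_category(0.06))   # between TC2 and TC3
```

An intermediate category such as 2.5 can then be fed into an interpolated Mz,cat, which is where the more economical designs come from.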

The spreadsheet can be downloaded here: surfaceRoughnessLength.xls



Revisions:
[07/04/2014] Original
[23/04/2016] Changed download links to MiScion Pty Ltd Web Store

Wind Loading Risk Assessment

A simple spreadsheet which allows varying the life expectancy and calculates the mean return period (R) for the regional wind speed. It is based on formulae in Wind Loading of Structures by John Holmes.

It also attempts to map the calculated mean return period to the nearest Building Code of Australia (BCA) importance level, and to return the associated mean return period for such. This part is limited to the non-cyclonic regions A and B.
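The underlying relation is the standard one for the probability of at least one exceedance in a design life, P = 1 - (1 - 1/R)^L, as given by Holmes. A minimal sketch (function names are mine, not the spreadsheet's):

```python
def return_period(risk, life_years):
    """Mean return period R such that the probability of at least one
    exceedance of the design wind speed in `life_years` equals `risk`.
    Inverts the standard relation P = 1 - (1 - 1/R)**L."""
    return 1.0 / (1.0 - (1.0 - risk) ** (1.0 / life_years))

def lifetime_risk(R, life_years):
    """Forward check: probability of at least one exceedance in the lifetime."""
    return 1.0 - (1.0 - 1.0 / R) ** life_years

print(round(return_period(0.10, 50)))   # ~475 years: the classic 10% in 50 years
```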

The file can be downloaded here: windRiskAssessment2014.xls



Revisions:
[07/04/2014] Original
[23/04/2016] Changed download links to MiScion Pty Ltd Web Store

Wind Loading BCA Importance Levels

A simple spreadsheet that maps Building Code of Australia (BCA) importance levels against the annual probabilities of exceedance (actually mean return periods). Excel trend line facilities are then used to get a formula so that any mean return period can be mapped to an importance level.

Whilst AS1170.2 allows for any mean return period, the BCA has limitations which are not always suitable, starting with the fact that the BCA is primarily about habitable buildings: anything which is not a habitable building becomes BCA class 10. However the BCA is not suitable for the design of every structure classified as class 10. For example a garden shed or carport may be suitable to be designed to BCA volume 2, but a 600 m high radio mast is not.

Additionally, importance level 2, with a mean return period (R) of 500 years, is not suitable for all buildings, and then again importance level 1 (R=100 years) is not always suitable either. It is therefore helpful to identify alternative importance levels, to show that a structure has an importance between 1 and 2, for those who prefer importance levels.
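The trend-line idea can be sketched as below. The importance-level-to-R mapping used here (1→100, 2→500, 3→1000, 4→2500 years for wind in non-cyclonic regions) is an assumption for illustration; verify against the BCA/AS1170.0 tables. The least-squares fit of IL = a·ln(R) + b mimics an Excel logarithmic trend line.

```python
import math

# Assumed importance-level mapping for wind, non-cyclonic regions
# (illustrative values only - verify against the BCA/AS1170.0 tables):
levels = {1: 100, 2: 500, 3: 1000, 4: 2500}

# Least-squares fit of IL = a*ln(R) + b, mimicking an Excel log trend line.
xs = [math.log(R) for R in levels.values()]
ys = list(levels.keys())
n = len(xs)
a = (n * sum(x * y for x, y in zip(xs, ys)) - sum(xs) * sum(ys)) / (
    n * sum(x * x for x in xs) - sum(xs) ** 2)
b = (sum(ys) - a * sum(xs)) / n

def importance_level(R):
    """Fractional importance level for an arbitrary mean return period R."""
    return a * math.log(R) + b
```

A structure with, say, R = 250 years then maps to a fractional level between 1 and 2, which is exactly the "importance between 1 and 2" the post describes.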

The spreadsheet can be downloaded here: bcaImportanceLevels.xls



Revisions:
[07/04/2014] Original
[23/04/2016] Changed download links to MiScion Pty Ltd Web Store

Thursday, February 13, 2014

Plane Frame Analysis Front End Version 4

frontEndPFrame04.xls
I have now converted the variables used in the various plane frame analysis applications from records to classes. This version of the front-end for cpframe.exe can read a data file into the variables, then store the contents of the variables to the worksheet. It can also read the contents of the worksheet into the variables and then write the contents of the variables to a file.

In this way existing data files (.dat) can be opened and edited in Excel. Since reading a file into a workbook overwrites the contents of the cells, it's not a good idea to calculate the geometry or required loading in the worksheet cells; it is better to generate the model using vba, which is the point of the exercise. Using the variables and data structures now defined, I can write vba code to build a model directly into those variables and bypass the worksheet. The worksheet was just used to verify that I can read and generate the file through the use of the variables. The file format of the output doesn't match the file read, as the original data files were generated by QPro. Once I had got QPro to generate data files which pframe could read, I didn't worry too much that the file didn't match the exact format of the files saved by pframe. When converted to Excel, the format for numbers was taken from the format assigned to the Excel cells. This is no longer done; the format is now hard coded into the vba code. This was done because the file read macro first clears the cells, and whilst this has been modified to only clear the contents, the macro can read data files larger than the maximum formatted data area (highlighted in yellow in the worksheets), and therefore some format would need assigning to these extra records. The hard coded formats also closely match the original Pascal formats: not liking some of the original formats, I have changed them without affecting the ability of the original application to read the data (thus far).

From this point forward I will need to move away from buttons in the worksheet. As far as I know, vba variables have a life only as long as the execution of the macro. Thus when the macro stops, the variables cease to be accessible, though the memory may not be cleared. This is the reason for storing the contents of the variables to the worksheet, and then reading back from the worksheet. Though I did read recently that global variables do retain values between vba calls: maybe. Before including the subroutines for reading the data into the variables, I did trial reading then writing without storing to and retrieving from the worksheet first: the two buttons seemed to work. However I put that down to valid data still being in memory, and the same chunk of memory being addressed when the macro was next run; in other circumstances the data may not be valid, or a different chunk of available memory may be addressed. This is based on past experience with Excel 97, though at that time I admit I was not overly familiar with use of the Public keyword, so past failures to initialise variables via the open workbook method may have been because of not having global variables: though it seems that would have complained about unknown variables, and the initialisation wouldn't have worked. Not something I really want to test or rely on: so I will assume that from one vba call to the next all variables are wiped. Besides, since making the test for this application, I have removed the global variables and made them local to the main class.

So, on the assumption that variables are wiped from one vba call to the next, it is necessary to keep the vba code active, and to do that I need to move the buttons from the worksheet to a vba form: and have the form retain control until all tasks are complete. The alternative is to repeat large segments of initialisation code on each vba call. This latter approach was adopted for the wind loading functions contained in schTechLIB: each function calls subroutines which load arrays which are then searched for data. I may have to revisit that and see if I can initialise once.

Another problem to be addressed is access to the variables external to the class in which they are defined. The variables are arrays, and vba does not allow public arrays in the definition of a class. So whilst converting records into classes in Turbo Pascal and Delphi is relatively easy, it's cumbersome in vba. An alternative is to use collections in vba; these can be public, however they are highly likely to be incompatible with the analysis routines of plane frame. {Noting that the idea is to incorporate plane frame in the code, not shell out and pass a data file to a console application. The console application is just a development tool, and not affected by any code I write for the front-end creating the model to be passed to the frame analysis. The first task is to auto-generate a valid structural model of a manufactured structural product (MSP): not analyse it, that task comes later.}

If I don't use collections then I need to write property (Let, Get, Set) methods to access the individual elements of the arrays. At present I have only written one of the classes in Pascal, vb.net and vba, and the use of collections keeps the vba most similar to the Pascal and vb.net.
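The Let/Get pattern referred to looks roughly like this, sketched in Python for brevity (the vba version uses Property Get/Let procedures on the class, since vba forbids Public arrays in a class). The names here are hypothetical, not taken from the actual workbooks.

```python
class NodeTable:
    """Wraps private coordinate arrays behind element accessors,
    analogous to vba Property Get/Let procedures on a class whose
    array members cannot themselves be made Public."""

    def __init__(self, max_nodes):
        self._x = [0.0] * max_nodes   # private array, as in the vba class
        self._y = [0.0] * max_nodes

    def get_x(self, i):               # vba: Public Property Get X(i As Integer)
        return self._x[i]

    def let_x(self, i, value):        # vba: Public Property Let X(i As Integer, value)
        self._x[i] = value

nodes = NodeTable(10)
nodes.let_x(0, 4.5)
print(nodes.get_x(0))   # 4.5
```

The cost, as noted above, is one Get/Let pair per array, which is why collections look tempting despite the compatibility risk.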

As to which is the better option, I will work that out as I attempt to gain access to the variables and define a structural model.

Structural Design
At the moment I am assuming that structural design is split into three stages:

Stage 1
Consists of creating the structural model:

  1. Dimension and Geometry
  2. Design Actions

Stage 2
Consists of analysing the structure and determining the reactions and action-effects. The analysis method used depends on the structure being analysed; the tools used depend on the complexity of the analysis.

Stage 3
Consists of:

  1. Component sizing
  2. Connection Design
  3. Footing Design

However the whole process is not all that simple, as connections and footings themselves have their own components and may require repeating stage 1 to stage 3, using different analysis methods suited to the details of the component considered. That is, steps 2 and 3 of stage 3 can be removed and these components considered as requiring their own structural models. {e.g. connections modelled in isolation along with the applied forces, using the finite element method}

So cpframe only covers stage 2, for structures which can be modelled as plane frames. The front-end for cpframe covers stage 1, and the back-end covers stage 3. There is plenty of software available for stage 2 and stage 3, though not much choice for software covering Australian materials codes. It is stage 1 however where the problem lies with respect to rapid design of manufactured structural products (MSP). For simple MSP's a few simple calculations in a spreadsheet will suffice for all stages; for more complex structures some analysis engine is required for stage 2. If using an analysis engine then a structural model needs to be built from simple user input. For structural frameworks the form of the structural model is relatively similar: for example, I only need to add a few extra fields and WriteArc methods to all the classes to export the model to a MicroStran .arc file. These methods I already have in classes specifically written for writing MicroStran files. {A pity that MicroStran even needs to import the .arc file; otherwise I could launch it instead of cpframe.}

So, given that the data structures required to represent a structural model are relatively similar, I can create my own data structures and use them to create models for export to whatever structural software I choose. Note I am not concerned with picking up the dimension and geometry from a CAD model and adding loads to it: the dimension and geometry are to be auto-generated from a few simple parameters, with no other variations permitted. If other variations are wanted, then export a model for import into general purpose structural analysis software. An MSP has a defined structural form and a limited set of parameters: if the structural form changes then it is no longer the off-the-shelf MSP. I see the purpose of software at point-of-sale (PoS) not as permitting free design but as constraining the options available, so that the customer knows when they have just defeated the entire point and purpose of going to a supplier of MSP's: namely that some significant period of design and engineering is not required. On the other hand I want to make more customisable options available at the PoS, so that eventually we get auto-generation of the basic options and can then manually edit the more custom options: with the allowable editing constrained. {NB: Some advanced CAD systems have, for more than 20 years now, been able to define dimensions by mathematical expressions. For example the span can be set to twice the height. Plus various nodes or points can be used as constraints and references. However that doesn't necessarily equate to being able to increase the quantity of components, such as varying the number of portal frames with the length of the building. Also, in the building industry they don't pay too much attention to assemblies and sub-assemblies: so the building, rather than being a series of portal frames, becomes a forest of columns with a collection of rafters. Thus 3D modelling can change the perception of the structure: getting away from the sub-assemblies which make it up.}
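What auto-generation from a few simple parameters means can be shown with a toy sketch. The node layout and the span-equals-twice-the-height rule below are illustrative only, and not the data format used by pframe or MicroStran.

```python
def gable_frame_nodes(span, eave_height, ridge_height):
    """Toy illustration of stage 1 auto-generation: the five nodes of a
    simple gable portal frame derived from three parameters."""
    return [
        (0.0, 0.0),                  # left column base
        (0.0, eave_height),          # left eave
        (span / 2.0, ridge_height),  # ridge / apex
        (span, eave_height),         # right eave
        (span, 0.0),                 # right column base
    ]

# A constrained parametric rule, e.g. span fixed at twice the eave height:
h = 3.0
nodes = gable_frame_nodes(span=2 * h, eave_height=h, ridge_height=h + 0.6)
```

The same handful of parameters would also drive member connectivity and the applied loads; the point is that the customer edits the parameters, never the model itself.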

In any case the focus at the moment is auto-generation of the structural model, and ultimately being able to analyse that model using various tools to get comparative checks on structural adequacy, so that independent technical checking is viable. In the past, and even now, independent checking is a problem: the manufacturers use proprietary software to generate and check compliance in a few minutes, whilst the city council's engineer certifying the structure may require a week to check the design using general purpose tools, or otherwise simply reads through the reports produced.

So my view is to provide the tools to the certifiers and general designers first, for use with general purpose structural analysis tools. This in turn increases the potential for MSP's in the first place, and with MSP's comes the manufacturers' desire to limit what the sales team can sell, to keep the production economical. So the progression is from the flexible to more and more constrained.



Download frontEndPFrame04.xls.

DISCLAIMER :
Users of the software must accept this disclaimer of warranty :
The software is supplied as is. The author disclaims all warranties, expressed or implied, including without limitation, the warranties of merchantability and of fitness for any purpose. The author assumes no liability for damages, direct or consequential, which may result from the use of the software.



Revisions:
[13/02/2014] Original

[23/04/2016] Changed download links to MiScion Pty Ltd Web Store

Tuesday, February 04, 2014

Future Development of our Plane Frame Program

It is not the intention to write a stand-alone frame analysis program. Software such as MicroStran and Multiframe serve that purpose reasonably well.

Similarly it is not the intention to write software supporting a building information model (BIM). Software such as ArchiCAD, Revit Structural, RAM Build, Tekla (XSteel/XEngineer), Master Series Master Portal serve that purpose.

None of this software however is suitable for manufacturers of structural products who do not employ engineers on staff. Employing an engineer on staff is not overly viable: first there is a shortage, and second there is the problem of a graduate gaining appropriate experience and guidance to do the required job in such places. Also, civil engineers are not overly suitable in any case: they don't have the repetitive manufacturing and product development knowledge. They typically design buildings one at a time, for a specific purpose, to comply with one set of regulations. Local regulations are irrelevant when designing a product; the product may have to meet regulations in multiple regions with minimum to zero variation in its form. On the other hand mechanical/manufacturing engineers lack the building structures knowledge. More importantly, an engineer employed on the staff of these manufacturers is likely to get bogged down dealing with the day to day hassles of cleaning up the mess created by the sales people.

The sales people and their customers need to be enabled and empowered to make the right decisions, to keep their company out of trouble and avoid hassles for the customer. I've trialled tables: they always want smaller increments in the tables and usually want to opt for the unconservative direction when making a choice. I've tried graphs: there are unusual and incorrect ways to read even simple graphs. Also, a single graph or table doesn't specify the requirements for a complex assembly; multiple such design aids have to be used, so complicating the process. Hence software is seen as the solution by the manufacturers, and they typically seek an Excel spreadsheet: typically because they know of something similar used elsewhere, or because they know we and others use Excel for our calculations.

Spreadsheets with calculations directly in the worksheet are not however a sensible option. It can involve a considerable amount of work just to add one additional structural element. Using arrays or records, adding such an additional structural element can be done with relatively minor effort. Therefore using a programming language like vba/Pascal and a database like MS Access or SQLite is a more efficient use of computing resources. Efficient use of computing resources is potentially becoming important again as the use of smart phones increases, along with Internet services.

From such a perspective Java and JavaScript may be considered the more preferable programming languages, as these are the more readily available programming languages for the Internet and Android phones.

To convert either the vba or Pascal source to vb.net, I have to convert all arrays to zero based indices. Since Java is largely C-like in its syntax, I assume it also requires zero based arrays. Therefore I could adopt Java in preference to vb.net; however Java is likely to impose more constraints on the use of classes. Further, as with Lazarus, Java has the problem that it isn't immediately compatible with the .net framework, and writing either a COM automation server or a .net component is going to be more complex than with Visual Studio and vb.net. Since I would like to produce something which can be used in either Excel or MS Access, or referenced in some other application, COM or .net is important.

At present however it is easier for me to convert the Turbo Pascal source to using classes/objects and zero based arrays. I will do this in parallel with further development in Lazarus, vb.net and vba. At the same time I will modify the program so that it is based on constructs which make it easier to translate to Java and C#. Where C# provides compatibility with the .net framework, and Java with Android: with the two languages being about as similar as vb.net and vba.

The approach may be slow, but it will permit the modules of the intended larger application to remain functional in at least one programming environment at any point in time.

pframe needs a back-end for member design for steel, cold-formed steel, timber and aluminium members. Currently this is only available in vba. Connection design is only available in Excel worksheet calculations.

Developing the back-end however I consider to be of secondary importance, as such facility is, to a limited extent, already available in software like MicroStran and Multiframe.

The feature missing from most of the commercial software for the Australian market is auto-generation of the structural model. Sure, the software can generate the dimension and geometry for a few common structural forms, but it doesn't auto-generate the structural model with the loading. Nor is the software able to iterate through various structural models subject to some constraint, such as increasing the height until some maximum moment is reached. {Though Multiframe is a COM automation server and can be programmed for such a task.}

With respect to auto-generation of a structural model in the form of data files for pframe and MicroStran, that is already written in Delphi and vba. Converting it to vb.net is expected to be relatively easy. However having auto-generation as an integral part of pframe would be a lot better, rather than interacting only via data files. Since converting pframe to vb.net is the obstacle, merging the Turbo Pascal and Delphi source seems the fastest approach: other than that the wind loading is obsolete. The wind loading really needs modifying so that it gets data from files rather than hardcoded arrays, for which purpose XML files may be suitable.

History of the Development of our Plane Frame Analysis Application

The earliest dated Pascal source files I can find on our systems are dated 1990, however I didn't start using the program until the 1996 version. The 1996 version uses Turbo Gadgets to provide walking menus, and it otherwise uses the full graphics screen to display moment diagrams etc. This I wrote about in a previous post on my blog: since it's a DOS based program using the full graphics screen, it's not overly compatible with Windows: it works on Windows XP but not Windows 7.

Back in 1996 the program was the only frame analysis program Roy Harrison & Associates (now MiScion Pty Ltd) used. A few years later we purchased a MicroStran license, and a few years after that, rather than get a second MicroStran license, we got a Multiframe license as a comparative check. Once we had the commercial software, this simple plane frame program fell out of use. {The application at this time had names alternating between f_wrk.exe and frame.exe}

Over the years however we have been repeatedly asked if we can supply software to help our clients' sales personnel customise their structural products at point-of-sale (PoS). None however were really serious about investing in the development of such software: just a hopeful expectation that, since we did everything by computer and wrote the tools for such, we could just throw something together. Not that simple.

The program as released here (cpframe) is the console (c) version of plane frame (pframe); I have removed the user interface. I never really used the user interface of the original program, because I wrote and used Borland Quattro Pro (QPro) macros to generate the data files. Our original projects involved standard calculations for sheds and carports, subject to a set of conditions but with a requirement to find the maximum height possible. Since wind loading is dependent on height, I needed to calculate new wind loads for each height until I found the limits of the structural section being used.

pframe.wb1
I could have written additional Pascal code and merged it with pframe to achieve such an objective; however I already had QPro spreadsheets for general wind loading calculations, so it was faster to write QPro macros to generate the data files for pframe.

Similarly I could have written more Pascal code to integrate cold-formed steel design to AS1538/AS4600 into pframe, but once again I already had QPro spreadsheets for general design of members and connections. In terms of solving clients' current problems I had no time to develop a full application in Pascal.

AS1538.wb1
Now whilst QPro could launch pframe, pframe didn't support command line parameters. Therefore I had to manually open the data file and run the analysis, then import the results into QPro. Since pframe only supports a single loadcase, I had to repeat this procedure for each loadcase. Pframe was thus a bottleneck in the whole process that I wanted to remove. {This bottleneck was slightly eased once we got MicroStran and I modified the QPro macros to generate MicroStran .arc files. MicroStran could deal with all loadcases and envelope the extreme values, thus reducing the total number of steps compared to using pframe.}

Back in 1996, however, I had an aversion to touching the Pascal source code for pframe, written by my father, and adding the desired command line parameters or otherwise expanding the program. The aversion stemmed from not having any simple and rapid means of testing the program to ensure I hadn't messed up the calculations.

So I needed an alternative method. Whilst moment distribution is easy enough to learn for continuous beams, it's an otherwise cumbersome approach for plane frames (though extremely practical when there isn't anything else). American texts tend to show moment distribution calculations littered about a diagram of the frame, whilst others present a more tabular approach. Either approach seems messy and prone to error. Since I had a steel designers' manual with Kleinlogel formulae for the frames I was most interested in, I adopted the Kleinlogel formulae.
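For the continuous-beam case where moment distribution is genuinely easy, the method can be sketched in a few lines. The example below (Python for illustration) runs the classic balance-and-carry-over cycles for a two-span beam with equal spans under a UDL; the known result is a hogging moment of wL²/8 over the middle support.

```python
# Moment distribution for a two-span continuous beam (equal spans L, UDL w,
# outer ends simply supported). Clockwise-positive end-moment convention,
# carry-over factor 1/2. Expected support moment: w*L^2/8.

w, L = 10.0, 4.0          # UDL (kN/m) and span (m)
FEM = w * L**2 / 12.0     # fixed-end moment for a UDL

# End moments keyed by (member, end); two members AB and BC.
M = {('AB', 'A'): -FEM, ('AB', 'B'): +FEM,
     ('BC', 'B'): -FEM, ('BC', 'C'): +FEM}

# Distribution factors: equal EI/L both sides of joint B; a single member
# frames into each pinned outer joint.
DF = {'A': {('AB', 'A'): 1.0},
      'B': {('AB', 'B'): 0.5, ('BC', 'B'): 0.5},
      'C': {('BC', 'C'): 1.0}}
FAR = {('AB', 'A'): ('AB', 'B'), ('AB', 'B'): ('AB', 'A'),
       ('BC', 'B'): ('BC', 'C'), ('BC', 'C'): ('BC', 'B')}

for _ in range(50):                            # balance-and-carry-over cycles
    out_of_balance = 0.0
    for joint, ends in DF.items():
        unbalance = sum(M[e] for e in ends)
        out_of_balance = max(out_of_balance, abs(unbalance))
        for e, df in ends.items():
            M[e] -= df * unbalance             # balance the joint
            M[FAR[e]] -= 0.5 * df * unbalance  # carry half to the far end
    if out_of_balance < 1e-9:
        break

support_moment = abs(M[('AB', 'B')])           # hogging moment at mid support
```

For a plane frame the same bookkeeping spreads over many joints and sway corrections, which is exactly why closed-form Kleinlogel formulae were the more attractive option.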

Still, using Kleinlogel formulae is time consuming and prone to error, so I set up a spreadsheet to do the calculations. Spreadsheets being slightly cumbersome, however, I decided to also program the calculations in Delphi 3 (Pascal).

Qpro macros for AS1170.2 wind loading
Parallel to this was the desire to expand the wind loading calculations and make them more complete, rather than limited to the few scenarios most often encountered, with other scenarios handled manually. Since I was more familiar with arrays in high level languages like Fortran, Pascal and C, it seemed easier to develop the wind loading functions in Delphi (Pascal) than in the QPro macro language. The wind loading calculations were thus developed in parallel in QPro and Delphi, one used as a check against the other.
Now the problem with QPro macros is that the calculations are not up to date unless the macros have been run, and these were run either when the dialogue boxes collecting data were closed or when a hot key was used. This made the spreadsheets slightly cumbersome, but I had read somewhere that there was potential to create a dynamic link library (DLL) and add functions to QPro, and this seemed possible using Delphi. Though I hadn't read in detail how to do so, and it seemed complex anyway, the potential was there. Hence the parallel developments in Delphi and QPro were not considered wasteful, as they were expected to merge at some point in the future.

However, whilst wandering around book stores during my lunch break whilst working on contract in the city, I bumped into a book on Programming Office 97 using Visual Basic for Applications (vba). I had read some articles in computer magazines about Excel/vba but wasn't sure how vba related to Excel, plus I had a bias towards Borland software. Still I bought the book.

I had been hoping that Borland would merge the IDEs for Turbo Pascal, Turbo C and Turbo Basic, ensure they had the same function libraries, build in language converters (including converters to turn GWBASIC into simple QPro spreadsheets or Paradox applications), and furthermore make them the programming languages for Paradox and QPro. They kind of did this: they threw Paradox and QPro away and developed Delphi and C++ Builder. I didn't like Corel QPro: it was buggy. However, it was our second QPro for Windows license, and time was being wasted modifying my Borland spreadsheets to work in the Corel version. I didn't want to solve that problem by getting an additional Corel license. I was looking for an alternative spreadsheet, and after reading the book on vba and Office 97, I went and bought Office 97 Professional and the systems developer kit (SDK).

The wind loading functions I had programmed in Delphi, I translated into vba, thus making them available as functions in the Excel worksheet. With a simple change to a spreadsheet cell, all the wind calculations were up to date and the frame analysis using Kleinlogel was also completed. By manually changing parameters in the spreadsheet I could quickly map out the potential of all the available c-sections for use in cold-formed steel sheds.
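The shape of such worksheet UDFs can be sketched as below (Python standing in for the vba). The pressure relation p = 0.5·ρ·V² with ρ = 1.2 kg/m³, i.e. p = 0.6·V² Pa, is the form used by AS/NZS 1170.2; the multiplier arguments are placeholder names for the site multipliers, not values from the standard.

```python
# Worksheet-UDF style wind functions, sketched in Python. The dynamic
# pressure p = 0.6 * V^2 (Pa) follows the AS/NZS 1170.2 form (air density
# 1.2 kg/m^3); the multiplier defaults below are illustrative placeholders.

def design_wind_speed(v_region, m_d=1.0, m_z=1.0, m_s=1.0, m_t=1.0):
    """Site design wind speed (m/s): regional speed scaled by direction,
    terrain/height, shielding and topographic multipliers."""
    return v_region * m_d * m_z * m_s * m_t

def design_wind_pressure(v_des):
    """Design wind pressure in kPa from design wind speed in m/s."""
    return 0.6 * v_des**2 / 1000.0

v = design_wind_speed(45.0, m_z=0.91)   # m/s
q = design_wind_pressure(v)             # kPa
```

Exposed as UDFs, a change to any input cell ripples straight through to the pressures and the Kleinlogel frame moments with no macro to re-run.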

But I still had a problem. Translating the dialogue boxes from QPro to Excel 97 wasn't so easy. Connecting the dialogue boxes to cells in Excel seemed cumbersome compared to QPro; I may have been doing it wrong, but I had no real references for such. I tried the QPro way and that didn't seem to work: a similar approach does work in Excel 2003. Though there is still an issue of being able to abandon the dialogue box without automatically updating the worksheet: such was not a problem with QPro. Besides it appearing cumbersome to allocate data to drop down lists on the dialogue boxes, there was another problem with Excel, and that was timing. There seemed to be a timing problem between getting data from dialogue boxes, evaluating user defined functions (UDF) and updating worksheet calculations: either crashing or simply not calculating the correct answers.

Initially I had tried to replicate the QPro-like wind loading macros, making use of the worksheets to store the data tables, but that appeared to be part of the timing problem and I therefore decided to abandon that approach in favour of using arrays in vba. Due to the problems with dialogue boxes, I abandoned them in favour of setting up input forms fully in the worksheet. Once the scope and lifetimes of vba variables were better understood, the workbooks worked fine.

But due to the problems encountered with programming vba, development continued in parallel in Delphi. I did attempt to iteratively change the structure height in an Excel worksheet and capture and tabulate the resultant maximum frame moment. But there was a clear timing problem in that situation: the height can be changed faster than the worksheet can update the calculations. Incorporating delays could probably have fixed it, but why incorporate delays when the objective is to get calculated information as quickly as possible? Hence the Delphi program was expanded to iterate through the heights, or through the spans, or through both heights and spans, and calculate a table of maximum moments.
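Done entirely in code, that tabulation has no timing problem at all: it is just a nested loop over heights and spans. A minimal sketch (Python for illustration; the `max_moment` placeholder stands in for the Kleinlogel or frame-analysis calculation, and the load value is invented):

```python
# Sketch of the height/span iteration: tabulate a maximum frame moment over a
# grid of heights and spans. `max_moment` is a placeholder for the Kleinlogel
# calculation; the load w is illustrative only.

def max_moment(height, span, w=2.0):
    """Placeholder peak moment (kNm): gravity mid-span term vs a wind knee term."""
    return max(w * span**2 / 8.0, w * height**2 / 2.0)

heights = [2.4, 3.0, 3.6, 4.2]   # m
spans = [6.0, 9.0, 12.0]         # m

# One row per height, one column per span.
table = [[max_moment(h, s) for s in spans] for h in heights]
```

The resulting table is exactly what later fed the height/span charts.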

I then decided to produce a height/span chart, and charting in Excel seemed easier than using Delphi graphics. Using Excel I could produce and store tables and charts and any other reporting I may want; furthermore I could control Excel from Delphi. Unfortunately the Excel type library didn't import properly into Delphi due to keyword conflicts. The consequence was that programming Excel from Delphi was being done blind. A bit of a problem, as Delphi uses [] for arrays and () for function parameters whilst vba uses () for everything. So that part of the application also needed parallel developments in Excel and Delphi: test what I needed to do in Excel, then translate to Delphi.

This however was interrupted by changes to the wind loading code (AS1170.2), and since developing the application was a sideline to day-to-day structural design, it was more important to update the general purpose wind loading Excel workbooks than to update the wind loading in Delphi. As a consequence Delphi was abandoned for developing the height/span charts, and it was all written in Excel/vba, with all calculations in vba, thus avoiding problems of timing with worksheet calculation updates.

Since we had been going down the path of developing a stand-alone application in Delphi, pframe was part converted to Delphi to create a Windows application with graphics for the moment diagrams added, but without the rest of the interface developed.

However, with all the member design (AS4600/AS4100/AS1720) being written in Excel/vba only, and given the wind loading update issue, it was considered that a move to development in Visual Basic might be more productive. So we obtained Visual Studio (VS) 2003 and VB.net, and then, to get a second license, we ended up with VS 2005.

But spreadsheets are still easier to format reports in than messing around programming Delphi or VB.net. Sure, those who prefer MathCAD type presentations think otherwise: that Excel is poor for presentation. For some reason there was resistance to pushing forward with Delphi or VB.net development, because of resistance to plain ASCII text files (.txt) or rich text files (.rtf), and complications of print preview and getting things to printers. But none of that is really a problem with MS Windows, as Notepad (.txt) and WordPad (.rtf) are available on each machine. Sure, there is a possibility that the user can modify the results: but they would have difficulty proving and replicating such a contended error in the program. Further, today results are typically sent to pdf files rather than paper printouts: and the pdf files can be edited.

Which is another point: we had started to trial generating pdf files and producing all-electronic documents early in the 1990's, but it was cumbersome to use pdf files where we needed results for further calculations. It was far easier to use paper printouts and mark up the required data. Scanning handwritten calculations to incorporate with computer generated calculations also produced massive pdf files, and so electronic documents were abandoned: we didn't have large enough hard disks to store the stuff, and zip disks were expensive. Hence further reason to integrate all the calculations electronically: eliminate the paper printouts for reference and produce smaller pdf files.

VB.net turned out to be significantly different from vba, and therefore it was put aside and pframe was converted to Excel/vba (xlFrame), so that it could interact more directly with the Excel worksheet for input data and reporting.

Not long after doing that we were approached to provide a structures module for a carport/verandah spreadsheet. Whilst the structures module is relatively small, the spreadsheet itself is relatively large; much of it is data and could probably be better done using MS Access: which is another development track pursued, along with Paradox, with respect to materials management.

Now the Delphi application, besides using Kleinlogel formulae to generate height/span moment tables, also generates data files for pframe and MicroStran (.arc) as well as AutoCAD LT scripts (.scr), and it can also read data files from various other programs we have written. Much of this has been converted over to Excel/vba and extended further, but as separate Excel workbooks. Attempting to gather a lot more vba code together into a single integrated application hit some limit of Excel/vba. Whilst the code seems to run ok, it's not possible to work on the modules: on attempting to close/save the file it hits a memory limit and basically crashes. It won't save the file except through its recovery feature, with all the vba code removed.

Since I don't consider that I should use a better computer with more memory, nor that I should reduce the number of vba modules, further development in vba has stalled, leading me to revisit VB.net.

The expectation was that I could simply change xlFrame into a COM automation object or .net component or similar, which could be plugged into Excel. Then all the parallel developments would disappear, as all my wind loading and member design function libraries and the plane frame analysis could all be in vb.net, possibly in a single library. Unfortunately that prior problem of the differences between vb.net and vba makes such conversion difficult for the plane frame analysis: though it is a simple conversion for the function libraries.

Also I want software like the carport/verandah software to export data files compatible with pframe and also to generate MicroStran .arc files. When testing the data files exported by xlFrame, they were not compatible with the 1996 version of pframe. This led me back to looking for Turbo Pascal source code to compile the original 1996 version of the program and trace why the new data files were not compatible. Finding source code which compiled and used the same data files as the operational exe file, and which also produced correct results, wasn't so easy.

The change in the file format was attributed to a change in how partial loads were defined. The error in the calculations was tracked down to dynamically allocated variables being freed from memory before the results stored in those variables were actually used.

So having gone back to Turbo Pascal, and given I prefer Object Pascal to vba, especially with respect to arrays in classes, it does seem that further development in Pascal may be a better option than vb.net, with Lazarus being a viable alternative to Delphi. Though a COM automation server may not be as easy to develop in Lazarus as it is in Visual Studio.

In any case, at the moment I am maintaining parallel developments in Turbo Pascal, Delphi 3, Lazarus, vb.net (VS 2005) and vba as I convert the record data structures into classes/objects. The main difference at the moment is that an array has been converted into a collection in vba, though I may convert that back into an array and write property functions to access it. Both these vba approaches seem cumbersome compared to the other languages.

Plane Frame Analysis: Alternative Front End

I have created an alternative front-end for plane frame analysis. Instead of a single worksheet with all the data required by cpframe, the data has been split between multiple worksheets. This makes it easier to add extra data records for each data set. It also makes it simpler to read an existing data file into the workbook, which may be useful if an auto-generated data file doesn't appear to be producing the correct results or if cpframe cannot read the file.


As my primary interest is auto-generation of the models using vba and other programming languages, rather than building them in the worksheet, the next front-end I release will read the data into appropriate vba data structures, with a facility to save the data to the worksheet or retrieve it from the worksheet. Similarly I will write a back-end based on data structures similar to those cpframe uses to write the results in the first place.
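One possible shape for such frame-model data structures is sketched below, using Python dataclasses in place of the vba classes or user-defined types described. All field names, units and the sample portal frame are illustrative; this is not cpframe's actual record layout.

```python
# Illustrative frame-model data structures: nodes, members, member loads and
# a container model. Field names and units are invented for the sketch.

from dataclasses import dataclass, field

@dataclass
class Node:
    id: int
    x: float                    # m
    y: float                    # m
    restraint: str = "free"     # e.g. "free", "pin", "fix"

@dataclass
class Member:
    id: int
    node_i: int                 # start node id
    node_j: int                 # end node id
    E: float                    # elastic modulus, kPa
    A: float                    # area, m^2
    I: float                    # second moment of area, m^4

@dataclass
class MemberLoad:
    member_id: int
    w: float                    # UDL normal to member, kN/m

@dataclass
class FrameModel:
    nodes: list = field(default_factory=list)
    members: list = field(default_factory=list)
    loads: list = field(default_factory=list)

# A simple pinned-base rectangular portal as sample data.
model = FrameModel(
    nodes=[Node(1, 0.0, 0.0, "pin"), Node(2, 0.0, 3.0),
           Node(3, 6.0, 3.0), Node(4, 6.0, 0.0, "pin")],
    members=[Member(1, 1, 2, 200e6, 1e-3, 1e-6),
             Member(2, 2, 3, 200e6, 1e-3, 1e-6),
             Member(3, 3, 4, 200e6, 1e-3, 1e-6)],
    loads=[MemberLoad(2, 1.5)],
)
```

With the model held in structures like these, saving to a worksheet, writing a cpframe data file, or writing a MicroStran .arc file are all just different serialisations of the same object.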

As I convert cpframe to a Windows console application, I will also add an option to read and write directly to MS Excel. {NB: Currently cpframe is an MS DOS application and only supports 8.3 file name conventions. In converting to MS Windows, the intention is that it stays a command line console application.}

The file for the alternative front-end is:

frontEndPFrame02.xls

DISCLAIMER :
Users of the software must accept this disclaimer of warranty :
The software is supplied as is. The author disclaims all warranties, expressed or implied, including without limitation, the warranties of merchantability and of fitness for any purpose. The author assumes no liability for damages, direct or consequential, which may result from the use of the software.



Revisions:
[04/02/2014] Original

[23/04/2016] Changed download links to MiScion Pty Ltd Web Store

Plane Frame Analysis : The Back End

As mentioned in discussing the front-end, the development of the back-end for frame analysis is of secondary importance, as it has been relatively well catered for by off-the-shelf structural analysis software. For example, software like MicroStran and Multiframe has two options concerning sizing of structural members:

  1. Check
  2. Design

The "Check" option carries out an assessment as to whether the currently set structural sections are adequate for the action-effects generated by the applied loads, and otherwise gives some indication of the efficiency of such sections. The designer can then adjust some sections in an attempt to get more efficient sections, and run the analysis again and check adequacy. The designer can repeat this until they are happy with the result.

The "Design" option automatically finds the most efficient section for each member, the designer can then opt to use these sections or not. Typically adopting the results of using the "design" option is highly impractical. Consequently the results of the "design" option are just used as a guideline for manually changing some of the members but not all of them. Therefore some human interaction is required to reach the final design selection.
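The core of the "Check" option is simply a demand-versus-capacity comparison for each member, reported as a utilisation ratio so the designer can judge efficiency. A minimal sketch (Python for illustration; the member names, moments and capacities are invented):

```python
# Sketch of the "Check" option: compare each member's action-effect against
# its section capacity and report a utilisation ratio plus pass/fail.
# All numbers are illustrative placeholders.

def check_members(members):
    """members: list of (name, demand_kNm, capacity_kNm) tuples.
    Returns a list of (name, utilisation_ratio, passed)."""
    results = []
    for name, demand, capacity in members:
        ratio = demand / capacity
        results.append((name, ratio, ratio <= 1.0))
    return results

report = check_members([("rafter", 18.0, 24.0),   # 75% utilised: ok
                        ("column", 26.0, 24.0)])  # over capacity: fails
```

A ratio well below 1.0 flags an inefficient section worth downsizing; a ratio above 1.0 sends the designer back to resize and re-run, exactly the iteration described above.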

Additionally, little of the available software has integrated options for connection design and footing design; this is typically all done external to the program. Also, as previously mentioned, the 3D geometric model is not necessarily a valid structural model, and therefore there are other components designed external to the analysis software. Some 3D structural analysis software, explicitly optimised for buildings, allows components to be modelled graphically but excludes them from the 3D structural analysis and treats the components separately in the manner most appropriate for such components: thus allowing everything to be modelled visually but without creating an invalid structural model.

For manufactured structural products (MSPs) we typically expect:

  1. Reduced parameter set
  2. Reduced set of components

For example, cold-formed steel sheds and carports made from c-sections and employing rigid portal frames typically have such frames at 3m centres. The frame spacing is thus one parameter which is rarely changed, and when it is changed it is usually reduced. This is because the C75 typically used for girts and purlins is barely capable of spanning 3m: however, it wastes little floor space compared to larger sections fastened to the face of columns. The roof pitch is also typically locked. These things often need to be varied but are not in the scope of the standard calculations typically held by suppliers, hence the desire for software to allow such variation.

With respect to a true MSP, there is no real need for structural analysis software. Often the issue of analysis versus lookup tables arises, with lookup tables being considered inefficient. It is incorrect to conclude that lookup tables or databases are inherently inefficient for the task. In fact there is a good chance that structural analysis is the wasteful, inefficient option.

If we have something like the span tables of the timber framing code (AS1684) or the girt and purlin load capacity tables for c/z-sections, then certainly automating the lookup of values from such tables is likely to be inefficient if it is based on simply building a database containing the printed tables. Similarly it would be inefficient to place the span tables for steel carports and verandahs into a database. If we look at cold-formed steel sheds, then the manufacturers typically have an ad hoc random collection of standard calculations with no rationale behind them: the calculations are of little use to anyone, other than for annoying the customers with a failure to have anything compatible with their needs.

Rather than a database of values, the real need is for a database of constraints which can be readily attached to the input boxes on data collection forms. The MSPs are meant to be pre-engineered, thus all the engineering is expected to be done already. The engineering can therefore be used to define constraints and associated solutions, and the software can therefore run a lot faster. In other words, instead of searching through all the available structural sections to find one that works, from the very start we already know what the minimum suitable section is. We know the capabilities of the smallest available section, and also know the limitations of the largest available section. So it is not necessary to carry out structural analysis at the PoS to identify that a proposed building is beyond the scope of the typical cold-formed steel structure and requires significant custom engineering to make it feasible using cold-formed steel. {eg. We recently designed a 32m span x 8m high building using Fielders' largest Dimond sections. The section is not adequate in its own right and therefore had to be reinforced with steel flat bar and folded channel. Therefore feasible, but not something we would get the salesperson to do whilst chatting with the customer at PoS. This is not a Fielders shed, it is just using their materials.}

The database doesn't need to be massive. Further, if talking about large databases, then the structural drawings and structural model, especially if in 3D, represent an extremely large database. Whilst the analysis of a 3D structural model is typically very fast, the automatic sizing of the members by the software can be painfully slow. The earlier versions of Multiframe, for example, were extremely slow compared to MicroStran when it came to running the "design" option: now they are about the same, with Multiframe having got faster. This I expect had more to do with Multiframe's complex multi-window user interface than with the algorithm operating behind the scenes. So opting for analysis does not reduce the size of the database, nor does opting for lookup tables increase the size of the database. The structural product needs to be looked at carefully, and if it hasn't really been designed, that's going to be difficult.

For example, each available cold-formed c-section has a maximum achievable envelope for which it is suitable when used for a simple gable frame. Once that section has been selected for a proposed building, the connection components and footing system are also largely determined. Therefore we only really need to know what the defining envelope is for each c-section. A simple data input form can then automatically update, based on constraints, in response to simple inputs. Depending on the structural product, this could all be done by an extremely simple and small Excel spreadsheet.
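The constraint-database idea can be sketched in a few lines: each section carries a pre-engineered envelope, and PoS selection is a lookup, not an analysis run. The sketch below is Python for illustration, and the section names and envelope numbers are invented, not real pre-engineered values.

```python
# Sketch of the constraint database: each c-section has a pre-engineered
# span/height envelope; section selection is a simple lookup. Section names
# and envelope limits are invented for illustration.

ENVELOPES = {                  # section -> (max span m, max height m)
    "C100-10": (6.0, 2.7),
    "C150-15": (9.0, 3.6),
    "C200-19": (12.0, 4.8),
}

def select_section(span, height):
    """Smallest section whose pre-engineered envelope contains (span, height).
    Relies on ENVELOPES being listed smallest to largest (dicts preserve
    insertion order)."""
    for name, (s_max, h_max) in ENVELOPES.items():
        if span <= s_max and height <= h_max:
            return name
    return None   # outside the product range: custom engineering required

choice = select_section(8.0, 3.0)
```

The `None` case is exactly the "beyond the scope of the product" outcome: the form can say so immediately, with no structural analysis at the point of sale.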

However, all the engineering for the product needs doing first before any such constraints are available, and the building industry is not really into being proactive and designing a product to satisfy the needs of a market; it is instead highly reactive, only responding when it bumps into and trips over the customers' needs. On the other hand, if they did decide to be proactive and went to a consulting civil/structural engineer to get an MSP designed, they would bump into a series of problems: that's why the manufacturers typically hold a random collection of structural calculations obtained on an as-needs basis. An infinite number of points along a line segment of any length leads to an infinite number of standard designs being required, which is not practical; hence they seek software so that parameters can be varied on an as-needs basis. Most manufacturers however are too small to pay for development of such software, and also seemingly too small to pay for product development.

My view however is that they could pay for product development if they employed engineering associates on staff and made use of off-the-shelf software. They could develop the product in small steps and otherwise provide higher quality service to their customers, by having engineering capability on staff rather than hoping some external consultant is available at the time required.

If the focus is on product development, and on having a product available which meets the needs of the customer, then the PoS software can be kept simple and all the design and engineering done in the back room prior to customer enquiry. The real objective is to predict accurately what the customer wants and have it available already, not to ask them what they want and supply it at some future date.

Therefore the back-end of frame analysis is of secondary importance, as there is now a diverse range of structural analysis software available which can be used for sizing members. Where little effort has been put in is auto-generating the structural model, with geometry and loading. This is because the focus for high-end software is dealing with geometry which comes from an architect and having to transform this into a structural model.

For MSPs we are only concerned with the structure, and are therefore more able to generate geometry and loading. The importance of this is that at the point of generation we know that a certain structural element is in the roof or the wall, and therefore know what loads to apply to it automatically.

For architecturally defined geometry, we do not know that a beam is in the roof unless it carries additional data which can be interrogated, so that we can apply the correct load to it. CAD operators find putting lines on the appropriate layers cumbersome, and commands designed to ensure that entities have the appropriate layer and other attributes even more cumbersome. So the possibility that all elements in a building information model (BIM) are tagged correctly, to allow automation tools to work correctly, is relatively low.

For an MSP however everything is supposed to be predefined, and therefore we have far greater potential to auto-generate the structural model. If we can do that, then there is plenty of software available for what I have labelled the back-end of frame analysis. Developing a back-end is therefore not something I wish to give priority to, as all this other software provides the needed independent check on the design of the structural product. I have MicroStran and Multiframe licenses explicitly for the purpose of checking one against the other. Most of the time we only use one package, but when strange things occur we build models in both packages, check one against the other, and hunt down the causes of any variation.

With an auto-generated structural model and a large variety of software available to carry out the frame analysis and size the members, there is reduced potential to question the validity of a manufacturer's MSP, as there is potential for a large number of independent checks. The structural model is not hidden in some obscure software owned by a manufacturer. Further, the suppliers of the general purpose frame analysis software will be under increasing pressure to further develop the back-end capabilities of their software, as their software will be the ultimate bottleneck in the whole process. So why expend effort re-inventing the wheel? These software developers already have 80% or more of what is required for the back-end of frame analysis; let them add the missing features.

The current major bottleneck is building the model for use in the available software when it comes to the common structural forms of the MSPs. However, some manufacturers may be better served by a stand-alone structural analysis package with an integrated back-end highly customised to a specific MSP.

Simple Back-End
Therefore, to provide for experimentation with the back-end of frame analysis, I have thrown together a simple MS Excel template. The template just has a single button which reads the results file generated by cpframe and writes the results into the cells of a single worksheet. Once the results are inside a worksheet, they can be linked to other cells which are used for the sizing of members, the checking of connections and the sizing of footings. To deal with multiple load cases, however, it would be better to read the results into an array and process all the load cases using vba. It is generally preferable to avoid performing calculations in a worksheet unless a relatively tidy presentation can be developed: as the number of structural members and load cases increases, such presentation becomes increasingly impractical.
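Reading the results into per-loadcase arrays and enveloping the extremes in code can be sketched as below (Python standing in for the vba). The file layout used here, a CSV of loadcase, member and end moment, is a hypothetical stand-in: it is not cpframe's actual output format.

```python
# Sketch of the back-end read: group analysis results by loadcase, then
# envelope the extreme moment per member. The CSV layout is hypothetical,
# not cpframe's real results format.

import csv
import io
from collections import defaultdict

SAMPLE = """loadcase,member,moment_kNm
1,1,12.5
1,2,-8.3
2,1,15.1
2,2,-11.0
"""

def read_results(fileobj):
    """Group (member, moment) rows by loadcase number."""
    cases = defaultdict(list)
    for row in csv.DictReader(fileobj):
        cases[int(row["loadcase"])].append(
            (int(row["member"]), float(row["moment_kNm"])))
    return dict(cases)

results = read_results(io.StringIO(SAMPLE))

# Envelope: extreme absolute moment per member across all loadcases.
worst = {m: max(abs(mm) for lc in results.values()
                for (mid, mm) in lc if mid == m)
         for m in (1, 2)}
```

The enveloped values are then the only numbers that need to reach the worksheet for member, connection and footing checks.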

I see both the front-end and back-end being developed entirely in vba or another programming language. Whilst it is possible to do the calculations in the worksheet, it just becomes increasingly prone to error and a nightmare to manage. Why repeat a calculation in 10 cells by copying one cell, when we can write the formula once, place it in a loop, and be sure all 10 calculated results are based on the same formula at all times? Copying cells is prone to unexpected changes in cell references. Such changes may be easy to spot after the fact, but are not always fresh in the user's mind when they are copying the cells.

Worksheet calculations are useful for checking the operation of vba code and otherwise testing and breaking vba functions by attempting to supply invalid data. For example, testing a function for the case of division by zero: has it been covered? What other inputs can break the function? All of this is easier to test by grabbing input parameters from the worksheet. Whether it actually works when called from vba is another matter, as the features available to handle errors in a worksheet cell are not valid when executing solely within vba.
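The division-by-zero probe mentioned above looks like this in sketch form (Python for illustration; the utilisation function is a hypothetical UDF, and returning a sentinel stands in for whatever error handling the vba version would use in place of the worksheet's #DIV/0!):

```python
# Sketch of deliberately "breaking" a function with invalid input: a
# hypothetical demand/capacity UDF guarded against zero capacity. In a
# worksheet the unguarded version would show #DIV/0!; called purely from
# vba it would raise a runtime error instead, hence the explicit guard.

def utilisation(demand, capacity):
    """Demand/capacity ratio; returns None rather than crashing on zero capacity."""
    if capacity == 0:
        return None           # sentinel: caller must handle the invalid input
    return demand / capacity

ok = utilisation(18.0, 24.0)      # normal case
bad = utilisation(18.0, 0.0)      # the division-by-zero probe
```

The point of the probe is exactly this: confirm the guard exists before the function is buried inside a loop over hundreds of members and loadcases.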

In any case, the back-end template is:

backEndPFrame.xls

DISCLAIMER :
Users of the software must accept this disclaimer of warranty :
The software is supplied as is. The author disclaims all warranties, expressed or implied, including without limitation, the warranties of merchantability and of fitness for any purpose. The author assumes no liability for damages, direct or consequential, which may result from the use of the software.



Revisions:
[04/02/2014] Original
[23/04/2016] Changed download links to MiScion Pty Ltd Web Store