As indicated elsewhere across my blog, I hold an idealistic view of engineering: that engineering only takes place at the frontiers of science and technology. I believe this view is borne out by history, and that it started to be seriously distorted by the middle of the 1900s, when the focus shifted to the engineer as a licensed profession. Still, I consider the Washington Accord, defining the engineer in terms of education, to support my view that engineering is at the frontier rather than concerned with established technologies. Most people with a B.Eng don't put the intent of their education to use, and instead become guardians of safety, assessing the suitability of proposed adaptations and implementations of the established generic technologies: technicians with an abstract and esoteric box of tools and techniques.
I believe people have grabbed the wrong end of the stick when they consider Telford, Navier and Stephenson to be engineers simply because they designed and built bridges. People were designing and building bridges long before they arrived on the scene, and people without the title of engineer were designing and building bridges during their time.
The important factor is that they were operating at the limits of human knowledge and past experience. They were stepping into the unknown, and the success of their endeavours was uncertain to them; however, they weren't making wild guesses, they were progressing in a disciplined and learned manner: the past experience coming from the works of Vitruvius, and the means of moving forward coming from Desaguliers' Experimental Philosophy.
Telford was making use of materials and structural forms not previously used. He tested the materials and otherwise built smaller prototype versions of his bridges. These prototypes provided practice in the construction process, tested the concept, and provided something with which to communicate the objective to workers on the larger projects.
Navier made use of untested mathematical theory, which turned out to be a big mistake. But once he had validated the theory and calibrated it against reality, it became a useful and productive theory for the design of all manner of beams.
Robert Stephenson, along with William Fairbairn, built and tested prototype segments of a tubular bridge in a workshop, and through trial and error resolved problems concerned with buckling of plates, with Eaton Hodgkinson providing a mathematical assessment of the proposed bridge based on the then-developing structural theories. Still, a prototype bridge was built before tackling the main project.
Today the technical science for establishing the suitability of a proposed bridge is well established. Whilst fitness-for-function is a matter of subjective judgement, technical science is available to assess whether the desired performance can be achieved from a given proposal. The uncertainty and risk of failure are low: but only so long as persons highly conversant in the technical science and the technical characteristics of the proposed variant of a generic technology are responsible for assessing and approving the proposed design.
When at the frontier, there is no expert to turn to, no literature with the answer, the answer has to be extracted from nature itself, and that requires trial and error experimentation. But experiments themselves can be dangerous. So a disciplined, rational and controlled approach needs to be taken to the experimentation. The trials and errors are not a result of wild guesses, but thoughtful consideration.
The ingenious contriver of civilisation asks questions and goes in search of answers. They do not sit on the authority of their formal education and approved license, for such is trite and inadequate for their role as pioneer pushing forward the frontier.
Society, however, is not asking for any frontiers of science and technology to be pushed forward. In the main, people simply seek the proper implementation of the established technologies. With respect to these established technologies people have certain expectations, some reasonable, others unreasonable and impractical. In terms of the reasonable: people do not expect the wheels to fall off cars, they don't expect to fall through the floor of their house, they don't expect bridges to collapse when they drive over them, they don't expect ships to sink, or planes to fall out of the sky. These are established technologies and we can take reasonable steps to ensure they perform as expected.
Engineering science and engineering made these technologies feasible in the first instance; they defined the generic classes of technologies from which variants can be developed. The engineering is, for all intents and purposes, over. Sure, there are still frontiers associated with these technologies, but it's a long journey through extensive literature before you bump into the current frontier. Civil, industrial, mechanical, electrical, chemical: these are all technologies, not disciplines of engineering. Engineering is at a frontier; it is not yet classified, and when it is, then the engineering's over: that's the point of science and engineering. Mechatronics is not a new engineering discipline, it is a new area of technology.
Industry needs people who are conversant with the established technologies and who are able to adopt, adapt and apply these technologies to achieve specific objectives. This is not engineering, it is technical design. Engineers the likes of Smeaton and Coulomb operated at the frontiers, where they had no access to appropriate technical science; they developed the technical science, and published papers and presented lectures to share such knowledge. The published papers could be read by others and the theories contained within put to work. But most importantly, such published papers can be referenced by others: they set a benchmark. For example, no one should get buried in a trench, because we have the technical science to design a technical solution to avoid collapse of the trench walls. If a trench wall collapses we can reference national standards, safety manuals, industry manuals, and a variety of textbooks, reference manuals and journal articles. The collapse of a trench wall is largely an avoidable event, and the literature provides the means to avoid it. There may be uncertainty in the characteristics of the materials and the quality of the workmanship, but such uncertainty can be kept to a minimum. If there is a trench collapse we can identify that the persons involved failed to exercise adequate duty of care.
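As an illustration of the kind of published technical science referred to above, here is a minimal sketch of Rankine's classical active earth pressure, a textbook starting point for assessing the lateral pressure an unsupported trench wall must resist (the default soil parameters are hypothetical values for illustration, not taken from any standard):

```python
import math

def rankine_active_pressure(depth_m, unit_weight_kn_m3=18.0, phi_deg=30.0):
    """Active lateral earth pressure (kPa) at a given depth in a dry,
    cohesionless soil, from Rankine's classical theory:
        Ka  = tan^2(45 - phi/2)
        p_a = Ka * gamma * z
    where phi is the soil friction angle and gamma the unit weight."""
    ka = math.tan(math.radians(45.0 - phi_deg / 2.0)) ** 2
    return ka * unit_weight_kn_m3 * depth_m

# At 3 m depth in this assumed soil: Ka = 1/3, so p_a = 18 kPa
pressure = rankine_active_pressure(3.0)
```

A real trench assessment would of course also account for cohesion, surcharge, water, and the requirements of the applicable codes; the point here is only that the benchmark calculation is public knowledge.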
Information is being consolidated, organised and disseminated faster than ever before. It is important, therefore, that the available information is used to properly assess new implementations and adaptations of the established technologies.
Unfortunately there is also a problem of information overload, which hinders getting anything done. Only the real-world physical system is fully informed about itself; anything else can only contain partial information. The importance of design is to abstract and give consideration to the critical characteristics, not to attempt to simulate a complete virtual reality out of an inability to make decisions in the face of uncertainty.
Recommended Reading:
1) J.E. Gordon (1991), Structures: Or Why Things Don't Fall Down, Penguin
2) Jacques Heyman (1999), The Science of Structural Engineering, Imperial College Press
3) Stephen P. Timoshenko (1983), History of Strength of Materials, Dover
4) S.C. Hollister (1966), Engineer: Ingenious Contriver of the Instruments of Civilization, Macmillan
A journal on everything technological and everything to do with structure: from building structures, to organisation structures, politics, education, and business. If it has structure I will essay it, if it ought to have structure I will essay it. If it don't have structure and it is chaos, I essay that too!
Monday, June 16, 2014
Sunday, June 08, 2014
Metamorphs: Origin of the name
Back in 1996 I was getting bored: I was only getting to do structural drafting, whilst I had interests, capabilities and qualifications beyond that. I was also opposed to the continuous improvement movement, which seemed to be missing the perspective that a business needs to be dynamic and adaptive to respond to the dynamic environment in which it operates: an environment which is transformed by the very presence and action of the business enterprise. A world of chaos and uncertainty. Fluid as the imagination, like swirling paint mixing colours, the opening sequences to both Dr Who and the original The Tomorrow People. Not static: dynamic and moving, evolving. Utopia is static, and unsustainable, and is nowhere; and if it is somewhere, it is nowhere I want to be, and nowhere I want any part in creating. Utopia, if it is to be somewhere, has to be dynamic: a dynamic utopia {something I started to write about in 1986}. A dynamic utopia is adaptive and evolving, and is built upon appropriate technologies best adapted to the environment in which it has to exist. A dynamic utopia can provide the basis for building habitat in space, and marching across the universe: to boldly go... The journey is more important than the destination. If the Sun has something like 6 billion years left (the number seems to vary), then we have 2 billion years to find a new planetary home, 2 billion years to return and say, hey, we found it, and 2 billion years to return to it. Just one little problem: if we can survive travelling through space for 2 billion years, who needs a planet? {This thought came to mind watching the original Battlestar Galactica. Did they really need to get to their destination, and what then?} It is the exploration, the journey, that has value; that is life.
So with this swirling chaos in my mind I wrote down a multitude of names that I dreamed up, one of which was Zhymekt {serious holistic industrial and mechanical technology}. The names involved changes to the letters of the alphabet, but the main objective was to get reference to industrial and mechanical, to consider appropriate and adaptive technology, with implications of fluid imagination and Morpheus, shaper of dreams. With a pocket book full of possible names, I went to the business names registry office to check the availability of, and register, "Adaptive Technology", my preferred choice. At that point in time names could only be checked at the registration office, and that required a trip to the city {now names can be checked online, and nationally}. I had already procrastinated long enough, so it was going to be a day of action: check availability and get it registered. Adaptive Technology was already taken, and so were many variants, and so Metamorphs was checked, became the instant replacement, and was registered: for a business, or rather enterprise, that was going to pursue research and develop parametric generic models of adaptive technologies, and some other idealistic hyperbole.
Metamorphs produces, but it doesn't really trade. I make my living (?) through my father's business, Roy Harrison and Associates, which in 2009/2010 we went to register on the national companies register, only to find the name was taken; so we came up with MiScion Pty Ltd {which can be interpreted as: me offshoot (1.Scion 2.Scion), except that the word "me" is pronounced with the letter "i" having the sound as in the word "pit"}. Locally we still have registration of Roy Harrison and Associates as a business name, it is just not suitable for national usage.
So Metamorphs is where I pursue everything beyond structures: it doesn't have anything to do with MiScion Pty Ltd or Roy Harrison and Associates, though all the structural software I create and we use in the business has the name Metamorphs on it.
In any case, the following are the two original descriptions for the name:
Version 1:
Meta : Transcending above and beyond
Morph : Form and structure
=> Transcending above and beyond structure
--------------------------------------------------------------------------------
Implications:
•The meanings of words are unclear and imprecise.
•The definition of an object can either be fixed and refer to a clearly defined instance of that object.
•Or the definition can be unclear and refer to multiple variations of a generic form.
•Some people have open minds and can see the generic nature of objects, others cannot
•
•Services not restricted to Structural Engineering Drafting & Design
•Structure is an abstract concept and can apply to objects other than buildings
•From an Industrial Engineering viewpoint Organisation structure is not fixed and can be changed
•
•Analytical computer models are dependent initially on a structured view of an object. But these models should be parametric, flexible, adaptable, and amorphous.
•The Language of Object Oriented Programming (OOP) provides the terminology for discussing an object oriented universe.
•Defining a specific instance of an object, or a generic object that is a description of a collection of like objects, is a never ending task of refinement.
•Language itself is dynamic, and the meanings of words change with time
•Objects do not have to be physical material things. They can be abstract ideas.
•
•The laws of thermodynamics suggest that mass and energy can neither be created nor destroyed, just transformed from one form to another.
•The Gaia hypothesis suggests multiple levels of organisation structure and the emergence of higher forms of life. Cells congregate into multicellular lifeforms. These multicellular lifeforms congregate into societies, villages, cities, states, nations.
•The Gaia hypothesis suggests that there is a two-way interaction between life form and environment. The mere presence of a life form in an environment changes the environment. The life form needs to be continuously adapting to the constantly changing environment.
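The generic-versus-specific distinction running through these points can be sketched in OOP terms. A minimal Python illustration (the class and parameter names are my own, purely hypothetical):

```python
class Beam:
    """A generic object: a parametric description of a whole family of beams,
    not any one beam in particular."""

    def __init__(self, span_m, section, material):
        self.span_m = span_m      # parameters pin down one variant
        self.section = section
        self.material = material

    def describe(self):
        return f"{self.material} {self.section} beam spanning {self.span_m} m"

# A specific instance: one clearly defined member of the generic family.
garage_lintel = Beam(span_m=3.0, section="250UB26", material="steel")
```

The class stays open to endless variation and refinement; an instance fixes every parameter and so refers to a clearly defined object.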
Most of the form and structure that we perceive in the universe does not exist in reality, and is imposed by our own imaginations. If it does exist, then it is transient, and exists for a short time frame. Even our perceptions and understanding change with time, so that patterns which were perfectly clear and obvious yesterday become a complete mystery to us today.
Metamorphs seeks out abstract generic definitions of objects. With such generic definitions perceptions of need can be transformed, undergo metamorphosis, and give birth to radically new objects.
--------------------------------------------------------------------------------
Change one thing and you change everything!
Version 2:
Transcending above and beyond Structures.
Just as carbon, under extremes of temperature and pressure deep within the earth, can be transformed into diamond, and caterpillars emerge from their cocoons as butterflies, so too is it possible for engineers to transform the dust of the earth into valuable life-sustaining resources. To an Engineer the Physical World is as Fluid as the Imagination.
Metamorphs seeks out abstract generic definitions of objects. With such generic definitions perceptions of need can be transformed, undergo metamorphosis, and give birth to radically new objects.
A bicycle becomes a rickshaw, becomes electrified railway, becomes telegraph, becomes internet. The need for transportation is transformed and displaced by the need for communication.
Confucius stated that "learning starts with the precise meaning of words". However, the ancient Greek philosopher Cratylus considered change so complete and total that he believed the meanings of words change the very instant you utter them.
The word broadcast is an example of a word whose common meaning has changed with time. In the 1880s the common frame of reference was farming, and it meant to scatter seed widely by hand. Today the frame of reference is information cast broadly by electronic mass media, in the form of radio and television, and now the internet.
The meanings of words change through analogy and metaphor. With time the origin of the meaning becomes lost, and the metaphor reflects the common understanding of a word.
What is this? If you say pencil, then you know its name, what we call it, but not what it is.
It is: the existential brain cell of a Cray supercomputer. An instrument for communication through visual symbols. A graphite-based electrical resistor. A craft tool for sculpting clay. It is raw material.
What it is, is determined by the environment in which it is found: the needs, imaginations, and adaptive capabilities of those who find it available as raw material.
Once a product, whether goods or service, hits the marketplace, the public arena, its purpose and function change. The original design specification and intent become obsolete, and a need arises to redesign the product to match its actual usage. But once released, the process repeats itself, in a never-ending cycle.
Revisions:
[08/06/2014] : Original Post
[11/07/2015] : Minor Formatting changes, and links added.
Monday, April 14, 2014
My spreadsheets DAO and 64 bit Windows 7
In the process of reluctantly moving over to Windows 7, I found a problem with my spreadsheets. I didn't want to change operating systems, but a faulty power adapter connection forced me to get a new computer, about 18 months ago. Thus far the 64-bit Windows 7 computer has been a good dust collector, as little is installed on it, and it is otherwise slower than a wet weekend. What's with the spinning cursor? Just execute the command already, instead of unnecessarily animating stuff: it's supposed to be a general-purpose personal computing device, not a video game. Software developers seem to have lost the plot about what a personal computer is, its difference from a micro-computer, and the importance of stable infrastructure. Not to put too fine a point on it, but a virus is an annoying, irritating piece of software which interrupts the user's use of their computer: e.g. Microsoft updates are a virus. All the computer seems to have done for the past 18 months is download updates: switch it on to move software licences, cannot do anything because updates need installing, it runs out of battery power and needs recharging, go do something else. Get more spare time, repeat the same process, during which it seems to have got infected or otherwise messed up. Internet Explorer dead, camera not functioning, and cannot install software for it. Seems like I may have to wipe it clean and re-install from scratch. And there's all this rubbish about XP being defunct: it's working perfectly. Anyway, I'll write more about operating systems and personal computers later.
Since I don't have 64-bit Office, not even a current version for that matter (why would it need to be? I customise my own stuff), it seems that the Microsoft DAO 3.6 object library gets placed elsewhere. Normally it would be:
C:\Program Files\Common Files\microsoft shared\DAO\dao360.dll
On 64-bit Windows using 32-bit software it is likely to be:
C:\Program Files (x86)\Common Files\microsoft shared\DAO\dao360.dll
So to use my technical library I will need to manually change the reference in the VBA editor, under Tools > References.
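The path juggling above follows one general rule: 32-bit components get redirected to `Program Files (x86)` on 64-bit Windows. A sketch of that rule as a small helper (the function name is my own, for illustration):

```python
def dao360_path(windows_is_64bit, app_is_64bit):
    """Guess where dao360.dll sits: 32-bit software on 64-bit Windows is
    redirected to 'Program Files (x86)'; everything else uses 'Program Files'."""
    if windows_is_64bit and not app_is_64bit:
        base = r"C:\Program Files (x86)"
    else:
        base = r"C:\Program Files"
    return base + r"\Common Files\microsoft shared\DAO\dao360.dll"

# 32-bit Office on 64-bit Windows 7:
path = dao360_path(windows_is_64bit=True, app_is_64bit=False)
```

The same redirection applies to any 32-bit common files, not just DAO, which is why a hard-coded reference path in a spreadsheet breaks when it moves between machines.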
When I first started with computers I avoided Microsoft products and stuck with Borland (Turbo Pascal, Turbo C, Quattro Pro, Paradox). I cannot go back down that path, but I am considering the free software options which are available on both Windows and Linux platforms: start with them on Windows, get used to them, and then move over to Linux. Zorin OS may be worth looking at. There are a lot of problems in moving over to Linux, from major applications (AutoCAD, Multiframe) to small utilities (xyplorer, Beyond Compare). But back in the early 1990s most high-end graphical engineering software required a unix workstation, and likely a specific workstation like a Sun SPARC. The problem was drivers for graphics, mice, and printers. Under DOS every software developer needed to write drivers for their own application, and similarly unix/SunOS was locked to a specific or limited range of hardware. MS Windows eliminated part of the problem: if the computer could run Windows, then chances are it could run the specialist software.
The problem we have now, though, is not about supplying to meet needs, but changing stuff to acquire future income. Need and demand have been satisfied, so what can be done to generate income tomorrow? I know, let's knock all the bridges down, along with the power stations and water filtration and pumping stations. Got a new-fangled super material, and I reckon it would be unsafe, a major national security risk, to keep all these things made out of that rubbishy concrete stuff. Whilst we're at it, let's rip up those copper wires and that optical fibre rubbish: you know, the ones that connect the internet. And those tarmac, asphalt and concrete pavements, they're ugly, so let's get rid of them too: I'm sure we can think of something to replace them with, though it might be half defective and inferior. Wheels, yeah, let's re-invent them; triangular ones, they'd look good: wouldn't work, but who cares, it's change, and we shouldn't resist change. There is just so much garbage on the internet about upgrading, and resistance to change. You do not destroy the heritage, the infrastructure, the foundations on which everything is built.
Windows 8 would apparently be faster than Windows 7, and might fix the speed issue I have, but it's likely more incompatible with the software I want to run. All I needed was new hardware. Software is expensive and bought progressively over the years: but having got it, it's not so easy to put aside and replace progressively.
On the other hand, I did have a personal policy that for engineering stuff I'm supposed to be able to do with pencil and paper, we should develop software in-house and not depend on commercial software. So CAD, or a graphical editor, is really the only obstacle to moving to a different operating system: most anything else just requires a compiler or a spreadsheet.
However, there is no point in changing operating systems or application software if it is automatically updating on a regular basis. Once installed, I expect software to remain unchanged unless I find a problem with it. As for the internet and security, it is not necessary to be connected to the internet all the time, or to have every device connected to the internet.
We still operate DOS boxes, because some software is no longer available, or if a new Windows version is available there is no benefit which would justify the expense, and it otherwise doesn't function at the Windows command prompt. I don't believe that kind of updating should have occurred; after all, MS-DOS ran from a 720 kbyte 3.5" floppy disk, so it should have been possible to fully accommodate it in the 1 or 2 Gbyte bloat of the current Windows operating systems. From memory, SunOS had a full MS-DOS installation which opened in a separate process window.
So it shouldn't be a problem to have an operating system which spawns different operating environments in isolated containers, visually indicated by windows. I think it exists already, and it's called Unix.
Since I don't have 64 bit office, not even current for that matter (why would it need to be, I customize my own stuff), it seems that the Microsoft DAO 3.6 object library gets placed elsewhere. Normally it would be:
C:\Program Files\Common Files\microsoft shared\DAO\dao360.dll
On 64 bits using 32 bit software its likely to be:
C:\Program Files (x86)\Common Files\microsoft shared\DAO\dao360.dll
So to use my technical library will need to manually change the reference in the vba editor under tools references.
When first started with computers I avoided Microsoft products and stuck with Borland (Turbo Pascal, Turbo C, Quattro Pro, Paradox). I cannot go back down that path, but considering the free software options which are available on both Windows and Linux platforms. Start with them on Windows get used to them, and then move over to Linux. Zorin OS may be something worth looking at. There are a lot of problems to move over to Linux, from major applications (AutoCAD, Multiframe) to small utilities (xyplorer, beyond compare). But back in the early 1990's most high end graphical engineering software required a unix workstation: and likely a specific workstation like a Sun sparc. The problem being drivers for graphics, mice, and printers. At DOS every software developer needed to write drivers for their own application, and similarly with unix/Sun OS locked to specific or limited range of hardware. MS Windows eliminated part of the problem, if the computer could run Windows then chances are could run specialist software.
The problem we have now though is not about supplying to meet needs, but changing stuff to acquire future income. Need and demand been satisfied, so what can be done to generate income tomorrow. I know, lets knock all the bridges down, along with the power stations and water filtration and pumping stations. Got a new fangled super material and I reckon it would be unsafe, a major national security risk, to keep all these things made out off that rubbishy concrete stuff. Whilst we're at it lets rip up those copper wires and optical fibre rubbish: you know the ones that connect the internet. And those tarmac, asphalt and concrete pavements they're ugly so lets get rid off them to: I'm sure we can think of something to replace them with, might but half defective and inferior. Wheels yeah, lets re-invent them, triangular ones they'd look good: wouldn't work, but who cares its change, and shouldn't resist change. There is just so much garbage on the internet about upgrading, and resistance to change. You do not destroy the heritage, the infrastructure the foundations on which everything is built.
Windows 8 apparently would be faster than Windows 7, and might possibly fix the speed issue I have, but it's likely more incompatible with the software I want to run. All I needed was new hardware. Software is expensive and bought progressively over the years: but having acquired it, it's not so easy to put aside and replace progressively.
On the other hand I did have a personal policy that for engineering work I'm supposed to be able to do with pencil and paper, we should develop software in-house and not depend on commercial software. So a CAD or graphical editor is really the only obstacle to moving to a different operating system: most anything else just requires a compiler or spreadsheet.
However, there is no point changing operating systems or application software if it is automatically updating on a regular basis. Once installed I expect the software to remain unchanged unless I find a problem with it. As for the internet and security, it is not necessary to be connected to the internet all the time, or to have every device connected to it.
We still operate DOS boxes because some software is no longer available, or, if a new Windows version is available, there is no benefit which would justify the expense, and the old version otherwise doesn't function at the Windows command prompt. I don't believe that kind of updating should have occurred; after all, MS-DOS ran from a 720 kbyte 3.5" floppy disk, so it should have been possible to fully accommodate it in the 1 or 2 Gbyte bloat of the current Windows operating systems. From memory SunOS had a full MS-DOS installation which opened in a separate process window.
So it shouldn't be a problem to have an operating system which spawns different operating environments in isolated containers, visually indicated by windows. I think that exists already, and it's called Unix.
Monday, April 07, 2014
Wind Loading Surface Roughness Length versus Terrain Category
A simple spreadsheet making use of AS1170.2:1989 Appendix E. This appendix provides formulae for converting surface roughness length (z0) into terrain category and, for region A, for calculating the terrain adjusted wind speed or otherwise the terrain category multiplier (Mz,cat).
The AS1170.2 commentary contains a design chart which identifies various terrains and gives the surface roughness length (z0). From this chart it is apparent that most rural properties are not TC3, but neither are they TC2. By the use of z0 an intermediate terrain category can be calculated and more economical designs are possible.
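The conversion can be sketched as follows: a minimal Python sketch, assuming the relationship z0 = 2×10^(cat−4) commonly quoted for AS1170.2 terrain categories (TC2 at z0 = 0.02 m, TC3 at z0 = 0.2 m); the function name is illustrative, and the relation should be checked against Appendix E before relying on it.

```python
import math

def terrain_category(z0):
    """Interpolated terrain category from surface roughness length z0 (metres),
    assuming z0 = 2 * 10**(cat - 4), i.e. cat = 4 + log10(z0 / 2)."""
    return 4.0 + math.log10(z0 / 2.0)

# Open rural grassland (z0 = 0.02 m) returns TC2 exactly; a treed rural
# property with z0 around 0.06 m falls between TC2 and TC3, allowing an
# intermediate Mz,cat rather than the conservative jump to TC3.
```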
The spreadsheet can be downloaded here: surfaceRoughnessLength.xls
Revisions:
[07/04/2014] Original
[23/04/2016] Changed download links to MiScion Pty Ltd Web Store
Wind Loading Risk Assessment
A simple spreadsheet which allows varying the life expectancy and calculates the mean return period (R) for the regional wind speed. It is based on formulae in Wind Loading of Structures by John Holmes.
It also attempts to map the calculated mean return period to the nearest Building Code of Australia (BCA) importance level, and returns the associated mean return period for such. This part is limited to the non-cyclonic regions: A and B.
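The underlying calculation can be sketched as follows, a hedged Python sketch of the standard compound-risk relation of the kind given in texts such as Holmes' Wind Loading of Structures (the function name and exact form here are my paraphrase, not a transcription of the spreadsheet):

```python
def mean_return_period(risk, life_years):
    """Mean return period R such that the probability of the design wind
    speed being exceeded at least once in `life_years` equals `risk`.
    Based on r = 1 - (1 - 1/R)**L, solved here for R."""
    return 1.0 / (1.0 - (1.0 - risk) ** (1.0 / life_years))
```

For example, accepting a 5% risk of exceedance over a 50 year life gives R of roughly 975 years, which the spreadsheet would then map to the nearest BCA importance level.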
The file can be downloaded here: windRiskAssessment2014.xls
Revisions:
[07/04/2014] Original
[23/04/2016] Changed download links to MiScion Pty Ltd Web Store
Wind Loading BCA Importance Levels
A simple spreadsheet that maps Building Code of Australia (BCA) importance levels against the annual probabilities of exceedance (actually mean return periods). Excel trend line facilities are then used to get a formula so that any mean return period can be mapped to an importance level.
Whilst AS1170.2 allows for any mean return period, the BCA has limitations which are not always suitable, starting with the fact that the BCA is primarily about habitable buildings, and anything which is not a habitable building becomes BCA class 10. However the BCA is not suitable for the design of every structure classified as class 10. For example a garden shed or carport may be suitable to be designed to BCA volume 2, but a 600m high radio mast is not.
Additionally importance level 2, with a mean return period (R) of 500 years, is not suitable for all buildings, and then again an importance level of 1 (R=100 years) is not always suitable either; it is therefore helpful to identify alternative importance levels, to show that a structure has an importance between 1 and 2: for those who prefer importance levels.
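The trend-line approach can be sketched as a least-squares fit of importance level against ln(R), mirroring what the Excel trend line facility does. Note the importance-level/return-period pairs assumed here (IL1→100, IL2→500, IL3→1000, IL4→2000 years) are my reading of the non-cyclonic wind values and should be checked against the BCA edition in use; the function names are illustrative.

```python
import math

# Assumed mapping of BCA importance levels to mean return periods (years)
# for non-cyclonic wind regions: verify against the BCA edition in use.
LEVELS = [(1, 100), (2, 500), (3, 1000), (4, 2000)]

def fit_log_linear(points):
    """Least-squares fit of IL = a*ln(R) + b over (level, R) pairs."""
    xs = [math.log(r) for _, r in points]
    ys = [level for level, _ in points]
    n = len(points)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

def importance_level(R):
    """Fractional importance level for any mean return period R."""
    a, b = fit_log_linear(LEVELS)
    return a * math.log(R) + b
```

A structure designed for R = 250 years then reports an importance level between 1 and 2, which is the point of the exercise.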
The spreadsheet can be downloaded here: bcaImportanceLevels.xls
Revisions:
[07/04/2014] Original
[23/04/2016] Changed download links to MiScion Pty Ltd Web Store
Wednesday, March 12, 2014
Revised Links to downloads
The links to applications and MS Excel workbooks which were stored on the personal web space which came with my Internet access account have now been replaced with links using Dropbox. So the downloads should be available again.
Saturday, March 08, 2014
In search of South Australian Building Industry Web Presence
The following is a list of various suppliers to and in the South Australian building industry. It is not a recommendation; it is just a search for suppliers I am familiar with and an investigation into what web presence they have, whether it is a web site, a presence on social networks, or simply a listing in the white pages and/or yellow pages telephone directories. Some of these businesses we have done design/engineering work for in the past, or otherwise specify their products; others simply have a presence in the marketplace which cannot be ignored.
Cold-Formed Steel Sheds:
- Erecta Shed
- Mark Lattin Steel Constructions
- Alpha Industries
- Delta Sheds {No longer trading}
- Galpruffe {No longer trading}
- Olympic Industries
- Magnus Australia
- Cockaleechie Industries
- Fielders Endurance Structures
- Ranbuild
Cold-Formed Steel
Panel and Block Construction
- Ramset
- Hilti
- Pryda {nail plates}
- MiTek Building Systems {Gangnail, nail plates}
- James Hardie
- Nobles {Cables and Rigging}
Retaining Walls
Water Tanks
Soil Bore Logs
Land Surveying
Building Surveyors &/or Private Certifiers
Drafting and Design Services
Work Shop Detailers
{No web presence found at the moment for those I know}
Pergolas, Verandas, other Canopies and Decking
{Mostly Timber}
- Pro-Form Pergolas
- Alfresco Pergolas
- Pergolas of Distinction
- Outdoor Innovations
- Craig Potter Construction {Steel}
- Revolution Roofing (Steel)
Balustrades, and Pool Screens
General Metal Fabricators
Houses and other Residential Construction
Electrical
Golf Nets and other Sports Nets
Consulting Engineers for Infra-Structure Size Projects
Thursday, March 06, 2014
Something gone wrong with personal web space!
It seems my personal web space which comes with my Internet account is no longer accessible. Whilst the html pages won't display, it seems the files I made available for download are still available. I can also access the site via ftp and upload more files. But otherwise it cannot display the html.
Also two ftp applications weren't all that clear and gave the impression that the site files existed twice. I thought that was the problem so I deleted one copy. That was a bad idea, as it seems there was only one copy, and I have now deleted it. I have put back some of the content, but as the personal brain knowledge base took about an hour to load in the first place, I haven't put that back {it only took 3 minutes to delete}.
In any case I was considering putting the files elsewhere, and using a more general file sharing platform to make the files available. So I will update the posts and the links when I have that sorted out.
It also seems preferable to avoid using the personal web space: whilst I get the benefit of being able to write html as I please, the hyper-link to the site is longer than if I use a platform like Blogger or WordPress.
Though I have two problems with blogging platforms:
1) Embedding images
2) Linking to files.
Google Photos doesn't seem to have the link and embed codes available that Picasa has. There is no way to create folders and upload files on blogging platforms except in limited ways, the limited way being adding images to posts. Images for side bars and the like are not so easy. So I have currently removed some links in the side bar showing other web presence.
So I will experiment with options to replace the web site which was on the personal web space. It wasn't really used for anything other than as a space where I could put files and link to them. The web pages I can probably replicate on WordPress or Blogger.
Thursday, February 13, 2014
Plane Frame Analysis Front End Version 4
frontEndPFrame04.xls
In this way existing data files (.dat) can be opened and edited in Excel. Since reading a file into a workbook overwrites the contents of the cells, it's not a good idea to calculate the geometry or required loading in the worksheet cells; it is better to generate the model using vba, which is the point of the exercise. Using the variables and data structures now defined, vba code can be written to build a model directly into those variables and bypass the worksheet. The worksheet was just used to verify that the file can be read and generated through the use of the variables.
The file format of the output doesn't match the file read, as the original data files were generated by QPro. Once I had got QPro to generate data files which pframe could read, I didn't worry too much that the files didn't match the exact format of the files saved by pframe. When converted to Excel, the format for numbers was taken from the format assigned to the Excel cells; this is no longer done, the format is now hard coded into the vba code. This was done because the file read macro first clears the cells: whilst this has been modified to only clear the contents, the macro can read data files larger than the maximum formatted data area (highlighted in yellow in the worksheets), and therefore some format would need assigning to these extra records. The hard coded formats also closely match the original Pascal formats: not liking some of the original formats, I have changed them without affecting the ability of the original application to read the data (thus far).
From this point forward I will need to move away from buttons in the worksheet. As far as I know, vba variables have a life only as long as the execution of the macro; when the macro stops, the variables cease to be accessible, though the memory may not be cleared. This is the reason for storing the contents of the variables to the worksheet, and then reading back from the worksheet. Though I did read recently that global variables do retain values between vba calls: maybe. Before including the subroutines for reading the data into the variables I did trial reading then writing without storing and retrieving from the worksheet first, and the two buttons seemed to work. However I put that down to valid data still being in memory, and the same chunk of memory being addressed when the macro was next run; in other circumstances the data may not be valid and a different chunk of available memory may be addressed. This is based on past experience with Excel 97, though at that time I admit I was not overly familiar with use of the Public keyword, so past failures to initialise variables via the workbook open method may have been because of not having global variables: though in that case it seems it would have complained about unknown variables and the initialisation wouldn't have worked. Not something I really want to test or rely on: so I will assume that from one vba call to the next all variables are wiped. Besides, since making the test for this application, I have removed the global variables and made them local to the main class.
So on the assumption that variables are wiped from one vba call to the next, it is necessary to keep the vba code active, and to do that the buttons need to move from the worksheet to a vba form: with the form retaining control until all tasks are complete. The alternative is to repeat large segments of initialisation code on each vba call. This latter approach was adopted for the wind loading functions contained in schTechLIB; each function calls subroutines which load arrays which are then searched for data: I may have to revisit that and see if I can initialise once.
Another problem to be addressed is access to the variables external to the class in which they are defined. The variables are arrays, and vba does not allow public arrays in the definition of a class. So whilst converting records into classes in Turbo Pascal and Delphi is relatively easy, it's cumbersome in vba. An alternative is to use collections in vba; these can be public, however they are highly likely to be incompatible with the analysis routines of plane frame. {Noting that the idea is to incorporate plane frame in the code, not shell out and pass a data file to a console application. The console application is just a development tool, and not affected by any code I write for the front-end creating the model to be passed to the frame analysis. The first task is to auto-generate a valid structural model of a manufactured structural product (MSP): not analyse it, that task comes later.}
If I don't use collections then I need to write property (Let, Get, Set) methods to access the individual elements of the arrays. At present I have only written one of the classes in Pascal, vb.net and vba, and the use of collections keeps the vba most similar to the Pascal and vb.net.
Which is the better option I will work out as I attempt to gain access to the variables and define a structural model.
Structural Design
At the moment I am assuming that structural design is split into three stages:
Stage 1
Consists of creating the structural model:
- Dimension and Geometry
- Design Actions
Stage 2
Consists of analysing the structure and determining the reactions and action-effects. The method of analysis used depends on the structure, and the tools used for the analysis depend on the complexity of the analysis.
Stage 3
Consists of:
- Component sizing
- Connection Design
- Footing Design
However the whole process is not all that simple, as connections and footings themselves have their own components and may require repeating stages 1 to 3, using different analysis methods suited to the details of the component considered. That is, steps 2 and 3 of stage 3 can be removed and these components considered as requiring their own structural models. {eg. connections modelled in isolation along with applied forces, using the finite element method}
So cpframe only covers stage 2, for structures which can be modelled as plane frames. The front-end for cpframe covers stage 1, and the back-end covers stage 3. There is plenty of software available for stages 2 and 3, though not much choice for software covering Australian materials codes. It is stage 1 however where the problem lies with respect to rapid design of manufactured structural products (MSP). For simple MSP's a few simple calculations in a spreadsheet will suffice for all stages; for more complex structures some analysis engine is required for stage 2. If using an analysis engine then a structural model needs to be built from simple user input. For structural frameworks the form of the structural model is relatively similar; for example, I only need to add a few extra fields and add WriteArc methods to all the classes to export the model to a MicroStran .arc file. These methods I already have in classes specifically written for writing MicroStran files. {Pity that I think MicroStran even needs to import the arc file. Otherwise I could launch that instead of cpframe.}
So given that the data structures required to represent a structural model are relatively similar, I can create my own data structures and use them to create models for export to whatever structural software I choose. Note I am not concerned with picking up the dimension and geometry from a CAD model and adding loads to it: the dimension and geometry are to be auto-generated from a few simple parameters, with no other variations permitted. If other variations are wanted, then export a model for import into general purpose structural analysis software. An MSP has a defined structural form and a limited set of parameters: if the structural form changes then it is no longer the off-the-shelf MSP. I see the purpose of software at the point-of-sale (PoS) not as permitting free design, but as constraining the options available, so that the customer knows when they have defeated the entire point and purpose of going to a supplier of MSP's: namely that no significant period of design and engineering is required. On the other hand I want to make more customisable options available at the PoS, so that eventually we get auto-generation of the basic options and can then manually edit the more custom options: with the allowable editing constrained. {NB: Some advanced CAD systems have, for more than 20 years now, been able to define dimensions by mathematical expressions. For example the span can be set to twice the height. Plus various nodes or points can be used as constraints and references. However that doesn't necessarily equate to being able to increase the quantity of components, such as varying the number of portal frames with the length of the building. Also in the building industry they don't pay too much attention to assemblies and sub-assemblies: so the building, rather than being a series of portal frames, becomes a forest of columns with a collection of rafters. Thus 3D modelling can change the perception of the structure: getting away from the sub-assemblies which make it up.}
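The auto-generation idea above can be sketched as follows. This is a hypothetical Python sketch, not the actual Pascal/vba data structures: the class and field names are illustrative only, and a real model would also carry sections, restraints and load cases.

```python
from dataclasses import dataclass, field

@dataclass
class Model:
    """Minimal stand-in for a plane frame structural model."""
    nodes: list = field(default_factory=list)    # (x, y) coordinates
    members: list = field(default_factory=list)  # (node_i, node_j) index pairs

def portal_frame(span, eave_height, ridge_height):
    """Auto-generate the dimension and geometry of a simple gable portal
    frame from a few parameters, with no other variation permitted."""
    m = Model()
    m.nodes = [
        (0.0, 0.0),                  # left footing
        (0.0, eave_height),          # left eave
        (span / 2.0, ridge_height),  # ridge
        (span, eave_height),         # right eave
        (span, 0.0),                 # right footing
    ]
    # Columns and rafters chained between consecutive nodes.
    m.members = [(i, i + 1) for i in range(len(m.nodes) - 1)]
    return m
```

From such a structure, writer methods (the equivalent of the WriteArc methods mentioned above) can then export the same model to whatever analysis package is chosen.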
In any case the focus at the moment is auto-generation of the structural model, and ultimately being able to analyse that model using various tools to get comparative checks on structural adequacy, so that independent technical checking is viable. In the past and even now, independent checking is a problem: the manufacturers use proprietary software to generate and check compliance in a few minutes, whilst the city council's engineer certifying the structure may require a week to check the design using general purpose tools, or otherwise simply reads through the reports produced.
So my view is to provide the tools to the certifiers and general designers first, for use with general purpose structural analysis tools. This in turn increases the potential for MSP's in the first place, and with MSP's comes the manufacturers' desire to limit what the sales team can sell, to keep production economical. So we go from the flexible to more and more constraints.
Download frontEndPFrame04.xls .
DISCLAIMER :
Users of the software must accept this disclaimer of warranty :
The software is supplied as is. The author disclaims all warranties, expressed or implied, including without limitation, the warranties of merchantability and of fitness for any purpose. The author assumes no liability for damages, direct or consequential, which may result from the use of the software.
Revisions:
[13/02/2014] Original
[23/04/2016] Changed download links to MiScion Pty Ltd Web Store
Tuesday, February 04, 2014
Future Development of our Plane Frame Program
It is not the intention to write a stand-alone frame analysis program. Software such as MicroStran and Multiframe serve that purpose reasonably well.
Similarly it is not the intention to write software supporting a building information model (BIM). Software such as ArchiCAD, Revit Structural, RAM Build, Tekla (XSteel/XEngineer), Master Series Master Portal serve that purpose.
None of this software however is suitable for manufacturers of structural products who do not employ engineers on staff. Employing an engineer on staff is not overly viable: first there is a shortage, and second there is the problem of a graduate gaining appropriate experience and guidance to do the required job in such places. Also civil engineers are not overly suitable; in any case they don't have the repetitive manufacturing and product development knowledge. They typically design buildings one at a time, for a specific purpose, to comply with one set of regulations. Local regulations are irrelevant when designing a product: the product may have to meet regulations in multiple regions with minimum to zero variation in its form. On the other hand mechanical/manufacturing engineers lack the building structures knowledge. More importantly an engineer employed on the staff of these manufacturers is likely to get bogged down dealing with the day to day hassles of cleaning up the mess created by the sales people.
The sales people and their customers need to be enabled and empowered to make the right decisions, to keep their company out of trouble and avoid hassles for the customer. I've trialled tables: they always want smaller increments in the tables and usually want to opt for the unconservative direction when making a choice. I've tried graphs: there are unusual and incorrect ways to read even simple graphs. Also a single graph or table doesn't specify the requirements for a complex assembly; multiple such design aids have to be used, so complicating the process. Hence software is seen as the solution by the manufacturers, and they typically seek an Excel spreadsheet: typically because they know of something similar used elsewhere, or because they know we and others use Excel for our calculations.
Spreadsheets with calculations directly in the worksheet are not however a sensible option. It can involve a considerable amount of work just to add one additional structural element. Using arrays or records, such an additional structural element can be added with relatively minor effort. Therefore using a programming language like vba/Pascal and a database like MS Access or SQLite is a more efficient use of computing resources. Efficient use of computing resources is potentially becoming important again as the use of smart phones increases, along with Internet services.
From such a perspective Java and JavaScript may be considered the more preferable programming languages, as these are the more readily available programming languages for the Internet and Android phones.
To convert either the vba or Pascal source to vb.net, I have to convert all arrays to zero based indices. Since Java is largely C-like in its syntax I assume it also requires zero based arrays. Therefore I could adopt Java in preference to vb.net; however Java is likely to impose more constraints on the use of classes. Further, as with Lazarus, Java has the problem that it isn't immediately compatible with the .net framework, and writing either a COM automation server or a .net component is going to be more complex than with Visual Studio and vb.net. Since I would like to produce something which can be used in Excel or MS Access, or referenced in some other application, COM or .net is important.
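The index shift is mechanical but error-prone. A small illustration of the pattern, in Python since the actual Pascal/vb.net code isn't shown here (the member-list example and function names are purely illustrative):

```python
# Pascal-style 1-based arrays can be mimicked with a dummy element at index 0;
# converting to zero based indices shifts every loop bound and subscript by one.
# Both functions build the same list of member connectivity pairs.

def members_1based(n_nodes):
    node = [None] + list(range(1, n_nodes + 1))  # dummy slot keeps indices 1..n
    return [(node[i], node[i + 1]) for i in range(1, n_nodes)]

def members_0based(n_nodes):
    node = list(range(1, n_nodes + 1))           # indices run 0..n-1
    return [(node[i], node[i + 1]) for i in range(n_nodes - 1)]
```

Every subscript in the analysis routines has to survive this shift, which is why doing the conversion in parallel with a working reference version matters.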
At present however it is easier for me to convert the Turbo Pascal source to using classes/objects and zero based arrays, and to do this in parallel with further development in Lazarus, vb.net and vba. At the same time I can modify the program so that it is based on constructs which make it easier to translate to Java and C#: where C# provides compatibility with the .net framework, and Java with Android, with the two languages being about as similar as vb.net and vba.
The approach may be slow, but it will permit the modules of the intended larger application to remain functional in at least one programming environment at any point in time.
pframe needs a back-end for member design of steel, cold-formed steel, timber and aluminium members. Currently this is only available in vba. Connection design is only available in Excel worksheet calculations.
Developing the back-end however I consider to be of secondary importance, as such a facility is, to a limited extent, already available in software like MicroStran and Multiframe.
The feature missing from most of the commercial software for the Australian market is auto-generation of the structural model. Sure, the software can generate the dimension and geometry for a few common structural forms: but it doesn't auto-generate the structural model with the loading. Nor is the software able to iterate through various structural models subject to some constraint: such as increasing the height until some maximum moment is reached. {Though Multiframe is a COM automation server and can be programmed for such a task}
With respect to auto-generation of a structural model in the form of data files for pframe and MicroStran, that is already written in Delphi and vba. Converting it to vb.net is expected to be relatively easy. However having auto-generation as an integral part of pframe would be a lot better, rather than interacting only via data files. Since converting pframe to vb.net is the obstacle, merging the Turbo Pascal and Delphi source seems the fastest approach: other than that the wind loading is obsolete. The wind loading really needs modifying so that it gets data from files rather than hardcoded arrays: for which purpose XML files may be suitable.
Similarly it is not the intention to write software supporting a building information model (BIM). Software such as ArchiCAD, Revit Structural, RAM Build, Tekla (XSteel/XEngineer), Master Series Master Portal serve that purpose.
None of this software however is suitable for manufacturers of structural products who do not employ engineers on staff. Employing an engineer on staff is not overly viable: one a shortage and second the problem of a graduate gaining appropriate experience and guidance to do the required job in such places. Also civil engineers are not overly suitable any case they don't have the repetitive manufacturing and product development knowledge. They typically design buildings one at a time for a specific purpose to comply with one set of regulations. Local regulations are irrelevant when designing a product, the product may have to meet regulations in multiple regions with minimum variation to zero variation in its form. On the other hand mechanical/manufacturing engineers lack the building structures knowledge. More importantly an engineer employed on the staff of these manufacturers is likely to get bogged down dealing with the day to day hassles of cleaning up the mess created by the sales people.
The sales people and their customers need to be enabled and empowered to make the right decisions to keep their company out of trouble and avoid hassles for the customer. I've trialled tables: they always want smaller increments to the tables and usually want to opt for the unconservative direction when making a choice. I've tried graphs: there are unusual and incorrect ways to read simple graphs. Also a single graph or table doesn't specify the requirements for a complex assembly, multiple such design aids have to be used and so complicating the process. Hence software is seen as the solution by the manufacturers and they typically seek an Excel spreadsheet. An Excel spreadsheet typically because they know of something similar used else where or because they know we and others use Excel for our calculations.
Spreadsheets with calculations directly in the worksheet are not however a sensible option. It can involve a considerable amount of work just to add one additional structural element. Using arrays or records such additional structural element can be done with relatively minor effort. Therefore using a programming language like vba/Pascal and a database like MS Access or sqlite is more efficient use of computing resources. Efficient use of computing resources potentially becoming important again as the use of smart phones increases, along with Internet services.
From such perspective Java and JavaScript may be considered the more preferable programming languages. As these are more readily available programming languages for the Internet and Android phones.
To convert either the vba or Pascal source to vb.net, I have to convert all arrays to zero based indices. Since Java is largely C like in its syntax I assume it also requires zero based arrays. Therefore could adopt Java in preference to vb.net, however Java is likely to impose more constraints on the use of classes. Further as with Lazarus, Java has the problem that it isn't immediately compatible with the .net framework: and writing either a COM automation server or .net component is going to be more complex than with Visual Studio and vb.net. Since I would like to produce something which can be used in either Excel, MS Access or referenced in some other application, COM or .net is important.
At present however it is easier for me to convert the Turbo Pascal source into using class/objects and to using zero based arrays. To do this in parallel with further development in Lazarus vb.net and vba. At the same time modify the program so that it is based on constructs which make it easier to translate to Java and C#. Where C# provides compatibility with the .net framework and Java with Android: with the two languages being about as similar as vb.net and vba.
The approach may be slow, but it will permit the modules of the intended larger application to remain functional in at least one programming environment at any point in time.
pframe needs a back-end for member design for steel, cold-formed steel, timber and aluminium members. Currently this is only available in vba, and connection design is only available in Excel worksheet calculations.
Developing the back-end, however, I consider to be of secondary importance, as such a facility is already available to a limited extent in software like MicroStran and Multiframe.
The feature missing from most of the commercial software for the Australian market is auto-generation of the structural model. Sure, the software can generate the dimensions and geometry for a few common structural forms, but it doesn't auto-generate the structural model with the loading. Nor is the software able to iterate through various structural models subject to some constraint, such as increasing height until some maximum moment is reached. {Though Multiframe is a COM automation server and can be programmed for such a task.}
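The kind of constrained iteration described can be sketched as follows (Python, illustrative only; the load and capacity values are made-up numbers). For the simplest case, a free-standing column under uniform lateral wind load, the base moment is M = w·h²/2, so height can be stepped up until the section's moment capacity is first exceeded.

```python
def base_moment(w_kn_per_m, h_m):
    # Cantilever column under uniform lateral load: M = w * h^2 / 2
    return w_kn_per_m * h_m ** 2 / 2.0

def max_height(w_kn_per_m, capacity_knm, step=0.1, h_limit=50.0):
    # Step the height up (integer step count avoids float drift) and
    # return the tallest height whose base moment stays within capacity.
    n = 0
    while (n + 1) * step <= h_limit and \
            base_moment(w_kn_per_m, (n + 1) * step) <= capacity_knm:
        n += 1
    return round(n * step, 3)

# w = 2 kN/m, capacity = 25 kNm: M = h^2, so the limit height is 5 m.
print(max_height(2.0, 25.0))  # → 5.0
```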
With respect to auto-generation of a structural model in the form of data files for pframe and MicroStran, that is already written in Delphi and vba, and converting it to vb.net is expected to be relatively easy. However, having auto-generation as an integral part of pframe would be a lot better than interacting only via data files. Since converting pframe to vb.net is the obstacle, merging the Turbo Pascal and Delphi source seems the fastest approach, except that the wind loading is obsolete. The wind loading really needs modifying so that it gets data from files rather than hardcoded arrays, for which purpose XML files may be suitable.
History of the Development of our Plane Frame Analysis Application
The earliest dated Pascal source files I can find on our systems are dated 1990; however, I didn't start using the program until the 1996 version. The 1996 version uses Turbo Gadgets to provide walking menus, and otherwise uses the full graphics screen to display moment diagrams etc. This I wrote about in a previous post on my blog: since it's a DOS-based program using the full graphics screen, it's not overly compatible with Windows. It works on Windows XP but not Windows 7.
Back in 1996 the program was the only frame analysis program Roy Harrison & Associates (now MiScion Pty Ltd) used. A few years later we purchased a MicroStran license, and a few years after that, rather than get a second MicroStran license, we got a Multiframe license as a comparative check. Once we had the commercial software, this simple plane frame program fell out of use. {The application at this time had names alternating between f_wrk.exe and frame.exe}
Over the years, however, we have been repeatedly asked if we can supply software to help our clients' sales personnel customise their structural products at point-of-sale (PoS). None, however, were really serious about investing in the development of such software: there was just a hopeful expectation that, since we did everything by computer and wrote the tools to do so, we could just throw something together. It's not that simple.
The program as released here (cpframe) is the console (c) version of plane frame (pframe): I have removed the user interface. I never really used the user interface of the original program, because I wrote and used Borland Quattro Pro (QPro) macros to generate the data files. Our original projects involved standard calculations for sheds and carports, subject to a set of conditions but with a requirement to find the maximum height possible. Since wind loading is dependent on height, I needed to calculate new wind loads for each height until I found the limits of the structural section being used.
I could have written additional Pascal code and merged it with pframe to achieve this objective; however, I already had QPro spreadsheets for general wind loading calculations, so it was faster to write QPro macros to generate the data files for pframe.
Similarly, I could have written more Pascal code to integrate cold-formed steel design (AS1538/AS4600) into pframe, but once again I already had QPro spreadsheets for general design of members and connections. In terms of solving clients' current problems, there was no time to develop a full application in Pascal.
Now whilst QPro could launch pframe, pframe didn't support command-line parameters. Therefore I had to manually open the data file and run the analysis, then import the results into QPro. Since pframe only supports a single loadcase, I had to repeat this procedure for each loadcase. Pframe was thus a bottleneck in the whole process that I wanted to remove. {This bottleneck was slightly improved once we got MicroStran and I modified the QPro macros to generate MicroStran .arc files. MicroStran could deal with all loadcases and envelope the extreme values, thus reducing the total number of steps compared to using pframe.}
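The value of command-line parameters here is that a driver script (or spreadsheet macro) can run the whole batch of loadcases unattended. A sketch (Python; the `cpframe` invocation and its `--loadcase` flag are hypothetical, since the actual 1996 program takes no such parameters):

```python
# Hypothetical: build one console command per loadcase, so each analysis
# can be launched without manually opening files and re-running.
def batch_commands(model_file, loadcases):
    return [f"cpframe {model_file} --loadcase {lc}" for lc in loadcases]

for cmd in batch_commands("shed01.dat", ["DL", "WL_cross", "WL_long"]):
    print(cmd)  # each line could be passed to the shell or a macro
```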
Back in 1996, however, I had an aversion to touching the Pascal source code for pframe, written by my father, to add the desired command-line parameters or otherwise expand the program. The aversion stemmed from not having any simple and rapid means of testing the program to ensure I hadn't messed up the calculations.
So I needed an alternative method. Whilst moment distribution is easy enough to learn for continuous beams, it's an otherwise cumbersome approach for plane frames (though extremely practical when there isn't anything else). American texts tend to show moment distribution calculations littered about a diagram of the frame, whilst others present a more tabular approach. Either approach seems messy and prone to error. Since I had a steel designers' manual with Kleinlogel formulae for the frames I was most interested in, I adopted the Kleinlogel formulae.
Still, using Kleinlogel formulae by hand is time consuming and prone to error, so I set up a spreadsheet to do the calculations. Spreadsheets being slightly cumbersome, however, I decided to also program the calculations in Delphi 3 (Pascal).
Parallel to this was the desire to expand the wind loading calculations and make them more complete, rather than limited to the few scenarios most often encountered, with other scenarios handled manually. Since I was more familiar with arrays in high-level languages like Fortran, Pascal and C, it seemed easier to develop the wind loading functions in Delphi (Pascal) than in the QPro macro language. The wind loading calculations were thus developed in parallel in QPro and Delphi, one used as a check against the other.
Now the problem with QPro macros is that the calculations are not up to date unless the macros have been run, and these were run either when the dialogue boxes collecting data were closed or when a hot key was used. This made the spreadsheets slightly cumbersome, but I had read somewhere that there was potential to create a dynamic link library (DLL) and add functions to QPro, and this seemed possible using Delphi. Though I hadn't read in detail how to do so, and it seemed complex anyway, the potential was there. Hence the parallel developments in Delphi and QPro were not considered wasteful, as they were expected to merge at some point in the future.
However, whilst wandering around book stores during my lunch break while working on contract in the city, I bumped into a book on programming Office 97 using Visual Basic for Applications (vba). I had read some articles in computer magazines about Excel/vba but wasn't sure how vba related to Excel, plus I had a bias towards Borland software. Still, I bought the book.
I had been hoping that Borland would merge the IDEs for Turbo Pascal, Turbo C and Turbo Basic, ensure they had the same function libraries, build in language converters (including converters to turn GWBASIC into simple QPro spreadsheets or Paradox applications), and furthermore make them the programming languages for Paradox and QPro. They kind of did this: they threw Paradox and QPro away and developed Delphi and C++ Builder. Corel QPro I didn't like; it was buggy. However, it was our second QPro for Windows license, and time was being wasted modifying my Borland spreadsheets to work in the Corel version. I didn't want to solve that problem by getting an additional Corel license, so I was looking for an alternative spreadsheet. After reading the book on vba and Office 97, I went and bought Office 97 Professional and the systems developer kit (SDK).
The wind loading functions I had programmed in Delphi I translated into vba, thus making them available as functions in the Excel worksheet. A simple change to a spreadsheet cell and all the wind calculations were up to date, and the frame analysis using Kleinlogel formulae was also completed. By manually changing parameters in the spreadsheet, I could quickly map out the potential of all the available c-sections for use in cold-formed steel sheds.
But I still had a problem. Translating the dialogue boxes from QPro to Excel 97 wasn't so easy. Connecting the dialogue boxes to cells in Excel seemed cumbersome compared to QPro; I may have been doing it wrong, but I had no real references for such. I tried the QPro way and that didn't seem to work (a similar approach does work in Excel 2003), though there is still the issue of being able to abandon the dialogue box without automatically updating the worksheet: such was not a problem with QPro. Besides it appearing cumbersome to allocate data to drop-down lists on the dialogue boxes, there was another problem with Excel, and that was timing. There seemed to be a timing problem between getting data from dialogue boxes, evaluating user defined functions (UDFs) and updating worksheet calculations: it was either crashing or simply not calculating the correct answers.
Initially I had tried to replicate the QPro-like wind loading macros, making use of the worksheets to store the data tables, but that appeared to be part of the timing problem, and therefore I decided to abandon that approach in favour of using arrays in vba. Due to the problems with dialogue boxes, I abandoned them in favour of setting up input forms fully in the worksheet. Once the scope and lifetimes of vba variables were better understood, the workbooks worked fine.
But due to the problems encountered with programming vba, development continued in parallel in Delphi. I did attempt to iteratively change the structure height in an Excel worksheet and capture and tabulate the resultant maximum frame moment, but there was a clear timing problem in that situation: the height can be changed faster than the worksheet can update the calculations. Incorporating delays could probably have fixed it, but why incorporate delays when the objective is to get calculated information as quickly as possible? Hence the Delphi program was expanded to iterate through the heights, the spans, or both, and calculate a table of maximum moments.
I then decided to produce a height/span chart, and charting in Excel seemed easier than using Delphi graphics. Using Excel I could produce and store tables, charts and any other reporting I might want; furthermore, I could control Excel from Delphi. Unfortunately the Excel type library didn't import properly into Delphi due to keyword conflicts, with the consequence that programming Excel from Delphi was being done blind. That is a bit of a problem, as Delphi uses [] for arrays and () for function parameters, whilst vba uses () for everything. So that part of the application also needed parallel development in Excel and Delphi: test what I needed to do in Excel, then translate to Delphi.
This however was interrupted by changes to the wind loading code (AS1170.2), and since developing the application was a sideline to day-to-day structural design, it was more important to update the general purpose wind loading Excel workbooks than to update the wind loading in Delphi. As a consequence, Delphi was abandoned for developing the height/span charts, and it was all written in Excel/vba, with all calculations in vba, thus avoiding the timing problems with worksheet calculation updates.
Since we had been going down the path of developing a stand-alone application in Delphi, pframe was part-converted to Delphi to create a Windows application with graphics for the moment diagrams added, but without the rest of the interface developed.
However, due to all the member design (AS4600/AS4100/AS1720) being written in Excel/vba only, and the wind loading update issue, it was considered that a move over to development in Visual Basic might be more productive. So we obtained Visual Studio (VS) 2003 and VB.net, and then, to get a second license, we ended up with VS 2005.
But spreadsheets are still easier to format reports in than messing around programming Delphi or VB.net. Sure, those who prefer MathCAD-type presentations think otherwise: that Excel is poor for presentation. For some reason there was resistance to pushing forward with Delphi or VB.net development, because of resistance to plain ASCII text files (.txt) or rich text files (.rtf), and the complications of print preview and getting things to printers. But none of that is really a problem with MS Windows, as Notepad (.txt) and WordPad (.rtf) are available on every machine. Sure, there is the possibility that the user can modify the results, but they would have difficulty proving and replicating such a contended error in the program. Further, today results are typically sent to pdf files rather than paper printouts: and the pdf files can be edited.
Which is another point: we had started to trial generating pdf files and producing all-electronic documents early in the 1990s, but it was cumbersome to use pdf files where we needed results for further calculations; it was far easier to use paper printouts and mark up the required data. Scanning handwritten calculations to incorporate with computer generated calculations also produced massive pdf files, and so electronic documents were abandoned: we didn't have large enough hard disks to store the stuff, and zip disks were expensive. Hence a further reason to integrate all the calculations electronically: eliminate the paper printouts for reference and produce smaller pdf files.
VB.net turned out to be significantly different from vba, and therefore it was put aside and pframe was converted to Excel/vba (xlFrame), so that it could interact more directly with the Excel worksheet for input data and reporting.
Not long after doing that, we were approached to provide a structures module for a carport/verandah spreadsheet. Whilst the structures module is relatively small, the spreadsheet itself is relatively large; much of it is data and could probably be better done using MS Access: which is another development track pursued, along with Paradox, with respect to materials management.
Now the Delphi application, besides using Kleinlogel formulae to generate height/span moment tables, also generates data files for pframe and MicroStran (.arc) as well as AutoCAD LT scripts (.scr), and it can read data files from various other programs written. Much of this has been converted over to Excel/vba and extended further, but as separate Excel workbooks. Attempting to gather a lot more vba code together into a single integrated application hit some limit of Excel/vba. Whilst the code seems to run OK, it's not possible to work on the modules: on attempting to close/save the file, it hits a memory limit and basically crashes. It won't save the file except through its recovery feature, with all the vba code removed.
Since I don't consider that I should use a better computer with more memory, nor that I should reduce the number of vba modules, further development in vba has stalled, leading me to revisit VB.net.
The expectation was that I could simply change xlFrame into a COM automation object or .net component or similar, which could be plugged into Excel. Then all the parallel developments would disappear, as all my wind loading and member design function libraries and the plane frame analysis could be in vb.net, possibly in a single library. Unfortunately, that prior problem of the differences between vb.net and vba makes such a conversion difficult for the plane frame analysis, though it is a simple conversion for the function libraries.
Also, I want software like the carport/verandah software to export data files compatible with pframe and to generate MicroStran .arc files. When testing the data files exported by xlFrame, they were not compatible with the 1996 version of pframe. This led me back to looking for Turbo Pascal source code to compile the original 1996 version of the program, and trace why the new data files were not compatible. Finding source code which compiled and used the same data files as the operational exe file, and which also produced correct results, wasn't so easy.
The change in the file format was attributed to a change in how partial loads were defined. The error in the calculations was tracked down to dynamically allocated variables being freed from memory before the results stored in those variables were actually used.
So, having gone back to Turbo Pascal, and given that I prefer Object Pascal to vba, especially with respect to arrays in classes, it does seem that further development in Pascal may be a better option than vb.net, with Lazarus being a viable alternative to Delphi. Though a COM automation server may not be as easy to develop in Lazarus as it is in Visual Studio.
In any case, at the moment I am maintaining parallel developments in Turbo Pascal, Delphi 3, Lazarus, vb.net (VS 2005) and vba as I convert the record data structures into classes/objects. The main difference at the moment is that an array has been converted into a collection in vba, though I may convert that back into an array and write property functions to access it. Both these vba approaches seem cumbersome compared to the other languages.
Plane Frame Analysis: Alternative Front End
Created an alternative Front-End for Plane Frame analysis. Instead of a single worksheet holding all the data required by cpframe, the data has been split between multiple worksheets. This makes it easier to add extra data records to each data set. It also makes it simpler to read an existing data file into the workbook, which may be useful if an auto-generated data file doesn't appear to produce the correct results, or if cpframe cannot read the file.
As my primary interest is auto-generation of the models using vba and other programming languages, rather than building them in the worksheet, the next front-end I release will read the data into appropriate vba data structures, with the facility to save the data to the worksheet or retrieve it from the worksheet. Similarly I will write a back-end based on data structures similar to those cpframe uses to write the results in the first place.
As I convert cpframe to a Windows console application, I will also add an option to read and write directly to MS Excel. {NB: Currently cpframe is an MS-DOS application and only supports 8.3 file name conventions. In converting to MS Windows the intention is that it stays a command-line console application.}
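The idea of holding the model in program data structures, with the worksheet as an optional store, can be sketched as follows. This is a minimal sketch in Python standing in for vba, and the record fields are hypothetical — they are not the actual cpframe data layout:

```python
from dataclasses import dataclass, astuple

# Hypothetical plane frame records; the real cpframe layout differs.
@dataclass
class Node:
    id: int
    x: float   # metres
    y: float   # metres

def to_rows(items):
    """Flatten records to rows, as would be written to a worksheet range."""
    return [list(astuple(item)) for item in items]

def nodes_from_rows(rows):
    """Rebuild records from worksheet-style rows."""
    return [Node(int(r[0]), float(r[1]), float(r[2])) for r in rows]

nodes = [Node(1, 0.0, 0.0), Node(2, 0.0, 3.0), Node(3, 4.0, 3.0)]
rows = to_rows(nodes)
assert nodes_from_rows(rows) == nodes  # round trip: save then retrieve
```

With the model held in structures like these, auto-generation, saving to the worksheet and writing the cpframe data file all become straightforward transformations of the same data.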
The file for the alternative front-end is:
frontEndPFrame02.xls
DISCLAIMER :
Users of the software must accept this disclaimer of warranty :
The software is supplied as is. The author disclaims all warranties, expressed or implied, including without limitation, the warranties of merchantability and of fitness for any purpose. The author assumes no liability for damages, direct or consequential, which may result from the use of the software.
Revisions:
[04/02/2014] Original
[23/04/2016] Changed download links to MiScion Pty Ltd Web Store
Plane Frame Analysis : The Back End
As mentioned in discussing the Front-End, the development of the back-end for frame analysis is of secondary importance, as it is relatively well catered for by off-the-shelf structural analysis software. For example, software like MicroStran and Multiframe has two options concerning the sizing of structural members:
- Check
- Design
The "Check" option carries out an assessment as to whether the currently set structural sections are adequate for the action-effects generated by the applied loads, and otherwise gives some indication of the efficiency of such sections. The designer can then adjust some sections in an attempt to get more efficient sections, and run the analysis again and check adequacy. The designer can repeat this until they are happy with the result.
The "Design" option automatically finds the most efficient section for each member, the designer can then opt to use these sections or not. Typically adopting the results of using the "design" option is highly impractical. Consequently the results of the "design" option are just used as a guideline for manually changing some of the members but not all of them. Therefore some human interaction is required to reach the final design selection.
Additionally, little of the available software has integrated options for connection design and footing design; this is typically done external to the program. Also, as previously mentioned, the 3D geometric model is not necessarily a valid structural model, so there are other components designed external to the analysis software. Some 3D structural analysis software explicitly optimised for buildings allows components to be modelled graphically but excludes them from the 3D structural analysis, treating each component separately in the manner most appropriate for it. This allows everything to be modelled visually without creating an invalid structural model.
For manufactured structural products (MSP's) we typically expect:
- Reduced parameter set
- Reduced set of components
For example, cold-formed steel sheds and carports made from c-sections and employing rigid portal frames typically have such frames at 3m centres. The frame spacing is thus one parameter which is rarely changed, and when it is, it is usually reduced. This is because the C75 typically used for girts and purlins is barely capable of spanning 3m; however it wastes little floor space compared to larger sections fastened to the face of columns. The roof pitch is also typically locked. These things often need to be varied, but are not in the scope of the standard calculations typically held by suppliers, hence the desire for software to allow such variation.
With respect to a true MSP, there is no real need for structural analysis software. The issue of analysis versus lookup tables often arises, with lookup tables being considered inefficient. It is incorrect to conclude that lookup tables or databases are inherently inefficient for the task; in fact there is a good chance that structural analysis is the wasteful, inefficient option.
For something like the span tables of the timber framing code (AS1684), or the girt and purlin load capacity tables for c/z-sections, automating the lookup of values is certainly likely to be inefficient if it is based on simply building a database containing the printed tables. Similarly it would be inefficient to place the span tables for steel carports and verandahs into a database. Looking at cold-formed steel sheds, the manufacturers typically have an ad hoc, random collection of standard calculations with no rationale behind them: the calculations are of little use to anyone, other than for annoying customers with a failure to have anything compatible with their needs.
Rather than a database of values, the real need is for a database of constraints which can be readily attached to the input boxes on data collection forms. MSP's are meant to be pre-engineered, thus all the engineering is expected to be done already. That engineering can therefore be used to define constraints and associated solutions, and the software can then run a lot faster. In other words, instead of searching through all the available structural sections to find one that works, we know from the very start what the minimum suitable section is. We know the capabilities of the smallest available section, and the limitations of the largest. So it is not necessary to carry out structural analysis at the point-of-sale (PoS) to identify that a proposed building is beyond the scope of the typical cold-formed steel structure and requires significant custom engineering to make it feasible in cold-formed steel. {eg. We recently designed a 32m span x 8m high building using Fielders' largest Dimond sections. The section is not adequate in its own right and had to be reinforced with steel flat bar and folded channel. Feasible, therefore, but not something you would get the salesperson to do whilst chatting with the customer at the PoS. This is not a Fielders shed; it just uses their materials.}
The database doesn't need to be massive. Further, if we are talking about large databases, then the structural drawings and the structural model, especially in 3D, represent an extremely large database. Whilst the analysis of a 3D structural model is typically very fast, the automatic sizing of the members by the software can be painfully slow. The earlier versions of MultiFrame, for example, were extremely slow compared to MicroStran when running the "design" option; they are now about the same, with MultiFrame having got faster. I expect this had more to do with MultiFrame's complex multi-window user interface than with the algorithm operating behind the scenes. So opting for analysis does not reduce the size of the database, nor does opting for lookup tables increase it. The structural product needs to be looked at carefully, and if it hasn't really been designed, that's going to be difficult.
For example, each available cold-formed c-section has a maximum achievable envelope for which it is suitable when used in a simple gable frame. Once that section has been selected for a proposed building, the connection components and footing system are also largely determined. Therefore we only really need to know the defining envelope for each c-section. A simple data input form can then automatically update, based on constraints, in response to simple inputs. Depending on the structural product, it could all be done by an extremely simple and small Excel spreadsheet.
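As a sketch of the constraints idea (in Python; the section names and envelope numbers below are invented for illustration, they are not real c-section capacities):

```python
# Hypothetical envelopes: (section, max span m, max height m) for a simple
# gable frame at a fixed frame spacing. Real values come from engineering
# done up front, not from analysis at the point-of-sale.
ENVELOPES = [
    ("C100", 6.0, 2.7),
    ("C150", 9.0, 3.0),
    ("C200", 12.0, 3.6),
    ("C250", 15.0, 4.2),
]

def minimum_section(span, height):
    """Return the smallest section whose envelope covers the request,
    or None if the building is outside the product's scope and needs
    custom engineering."""
    for section, max_span, max_height in ENVELOPES:  # smallest first
        if span <= max_span and height <= max_height:
            return section
    return None

assert minimum_section(8.0, 3.0) == "C150"
assert minimum_section(32.0, 8.0) is None  # beyond scope: custom engineering
```

No searching and no analysis happens at the PoS: the lookup is instant, and an out-of-scope request is identified immediately rather than after a failed analysis run.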
However, all the engineering for the product needs doing before any such constraints are available, and the building industry is not really into being proactive and designing a product to satisfy the needs of a market; it is instead highly reactive, only responding when it bumps into and trips over customers' needs. On the other hand, if a manufacturer did decide to be proactive and went to a consulting civil/structural engineer to get an MSP designed, they would bump into a series of problems: that's why manufacturers typically hold a random collection of structural calculations obtained on an as-needs basis. An infinite number of points along a line segment of any length leads to an infinite number of standard designs being required, which is not practical; hence they seek software so that parameters can be varied on an as-needs basis. Most manufacturers, however, are too small to pay for the development of such software, and seemingly also too small to pay for product development.
My view, however, is that they could pay for product development if they employed engineering associates on staff and made use of off-the-shelf software. They could develop the product in small steps and otherwise provide a higher quality service to their customers, by having engineering capability on staff rather than hoping some external consultant is available at the time required.
With a focus on product development, and on having a product available which meets the needs of the customer, the PoS software can be kept simple and all the design and engineering done in the back room prior to customer enquiry. The real objective is to predict accurately what the customer wants and have it available already, not to ask them what they want and supply it at some future date.
Therefore the back-end of frame analysis is of secondary importance, as there is now a diverse range of structural analysis software available which can be used for sizing members. Where little effort has been put in is auto-generating the structural model, with geometry and loading. This is because the focus for high-end software is dealing with geometry which comes from an architect, and transforming this into a structural model.
For MSP's we are only concerned with the structure, and are therefore better able to generate geometry and loading. The importance of this is that at the point of generation we know that a given structural element is in the roof or the wall, and therefore know what loads to apply to it automatically.
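Because the generator knows where each element sits, load assignment can be a simple mapping rather than an interrogation of the model. A minimal sketch (Python; the pressures are placeholder numbers, not values from any loading code):

```python
# Placeholder pressures (kPa); real values would come from the wind and
# dead/live load calculations for the specific site and building.
LOADS_BY_ROLE = {
    "roof": {"dead": 0.1, "wind": -0.9},   # wind uplift on the roof
    "wall": {"dead": 0.05, "wind": 0.7},   # wind pressure on a windward wall
}

def generate_member(member_id, role):
    """Create a member already tagged with its role and loads, because at
    the point of generation we know whether it is in the roof or a wall."""
    return {"id": member_id, "role": role, "loads": LOADS_BY_ROLE[role]}

rafter = generate_member(1, "roof")
girt = generate_member(2, "wall")
assert rafter["loads"]["wind"] == -0.9  # uplift applied automatically
assert girt["loads"]["wind"] == 0.7
```

The tagging costs nothing when the generator creates the element, which is exactly what is lost when geometry arrives untagged from an architectural model.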
For architecturally defined geometry, we do not know that a beam is in the roof unless it carries additional data which can be interrogated, so that the correct load can be applied to it. CAD operators find putting lines on the appropriate layers cumbersome, and commands designed to ensure that entities have the appropriate layer and other attributes even more cumbersome. So the probability that all elements in a building information model (BIM) are tagged correctly, allowing automation tools to work, is relatively low.
For an MSP, however, everything is supposed to be predefined, and therefore we have far greater potential to auto-generate the structural model. If we can do that, then there is plenty of software available for what I have labelled the back-end of frame analysis. Developing a back-end is therefore not something I wish to give priority to, as all this other software provides the needed independent check on the design of the structural product. I have MicroStran and Multiframe licences explicitly for the purpose of checking one against the other. Most of the time I only use one package, but when strange things occur I build models in both packages, check one against the other, and hunt down the causes of any variation.
With an auto-generated structural model and a large variety of software available to carry out the frame analysis and size the members, there is reduced scope to question the validity of a manufacturer's MSP, as there is potential for a large number of independent checks. The structural model is not hidden in some obscure software owned by a manufacturer. Further, the suppliers of general purpose frame analysis software will be under increasing pressure to develop the back-end capabilities of their software, as their software will be the ultimate bottleneck in the whole process. So why expend effort re-inventing the wheel? These software developers already have 80% or more of what is required for the back-end of frame analysis; let them add the missing features.
The current major bottleneck, for the common structural forms of the MSP's, is building the model for use in the available software. However, some manufacturers may be better served by a stand-alone structural analysis package with an integrated back-end highly customised to a specific MSP.
Simple Back-End
To provide for experimentation with the back-end of frame analysis I have therefore thrown together a simple MS Excel template. The template has a single button which reads the results file generated by cpframe and writes the results into the cells of a single worksheet. Once the results are inside a worksheet they can be linked to other cells, which are used for the sizing of members, the checking of connections and the sizing of footings. To deal with multiple load cases, however, it would be better to read the results into an array and process all the load cases using vba. It is generally preferable to avoid performing calculations in a worksheet unless a relatively tidy presentation can be developed: as the number of structural members and load cases increases, such presentation becomes increasingly impractical.
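Reading the results into program data structures, rather than into cells, makes processing multiple load cases straightforward. A minimal sketch (Python standing in for vba; the load case names, member numbering and result fields are invented, not the actual cpframe output format):

```python
# Hypothetical member results per load case: (moment kNm, axial kN).
results = {
    "G + Q":       {1: (12.4, -35.0), 2: (8.1, -22.0)},
    "G + Wu":      {1: (-18.9, 14.0), 2: (-11.2, 9.5)},
    "1.2G + 1.5Q": {1: (16.8, -47.3), 2: (10.9, -29.7)},
}

def governing_moment(member_id):
    """Find the load case giving the largest absolute bending moment for a
    member; that case then drives the member sizing check."""
    case = max(results, key=lambda c: abs(results[c][member_id][0]))
    return case, results[case][member_id][0]

case, moment = governing_moment(1)
assert case == "G + Wu" and moment == -18.9
```

Scanning all load cases for the governing action-effect is a two-line loop here; attempting the same across worksheet cells requires a block of MAX/INDEX formulas per member, which is exactly the untidy presentation to be avoided.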
I see both the front-end and back-end being developed entirely in vba or another programming language. Whilst it is possible to do the calculations in the worksheet, that becomes increasingly prone to error and a nightmare to manage. Why repeat a calculation in 10 cells by copying one cell, when you can write the formula once, place it in a loop, and be sure all 10 calculated results are based on the same formula at all times? Copying cells is prone to unexpected changes in cell references. Such changes may be easy to spot after the fact, but are not always fresh in the user's mind when copying the cells.
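The point about writing the formula once can be sketched like this (Python standing in for vba): one formula, applied in a loop, guarantees every result uses the same expression, with no relative cell references to go astray during copying.

```python
# One formula written once and applied in a loop: every result is
# guaranteed to come from the same expression, unlike 10 copied cells
# where a relative reference may silently shift.
def bending_moment_udl(w, L):
    """Midspan moment of a simply supported beam under a UDL: w*L^2/8."""
    return w * L**2 / 8.0

spans = [3.0, 4.0, 5.0, 6.0]
w = 2.0  # kN/m, example load
moments = [bending_moment_udl(w, L) for L in spans]
assert moments == [2.25, 4.0, 6.25, 9.0]
```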
Worksheet calculations are useful for checking the operation of vba code, and otherwise for testing and breaking vba functions by attempting to supply invalid data. For example, when testing a function for the case of division by zero: has it been covered? What other inputs can break the function? All of this is easier to test by grabbing input parameters from the worksheet. Whether the function actually works when called from vba is another matter, as the features available for handling errors in a worksheet cell are not valid when executing solely within vba.
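The division-by-zero case can be made explicit in the function itself, so its behaviour is defined whether it is called from a worksheet cell or from code. A sketch (Python; the function and its inputs are illustrative only — in vba the equivalent is an explicit test before dividing, since worksheet error values like #DIV/0! do not exist inside the code):

```python
def span_to_depth_ratio(span, depth):
    """Return span/depth, but handle depth == 0 explicitly instead of
    letting the division fail differently in different callers."""
    if depth == 0:
        return None  # the caller decides how to report the invalid input
    return span / depth

assert span_to_depth_ratio(6.0, 0.3) == 20.0
assert span_to_depth_ratio(6.0, 0) is None  # broken input handled, not fatal
```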
In any case, the back-end template is:
backEndPFrame.xls
DISCLAIMER :
Users of the software must accept this disclaimer of warranty :
The software is supplied as is. The author disclaims all warranties, expressed or implied, including without limitation, the warranties of merchantability and of fitness for any purpose. The author assumes no liability for damages, direct or consequential, which may result from the use of the software.
Revisions:
[04/02/2014] Original
[23/04/2016] Changed download links to MiScion Pty Ltd Web Store
Monday, February 03, 2014
Plane Frame Analysis: The Front End
The Plane Frame analysis command-line application previously released is typically expected to be used for developing and testing a larger application, which would comprise the following:
- Front-End which auto-generates the structural model (the data file for cpframe)
- cpframe
- Back-End which obtains the results from cpframe and then uses them to size members, check connections and size footings
Download frontEndPFrame01.xls
For a more flexible approach I will release an alternative template, with the data spread across multiple worksheets, making it easier to shrink and expand the structural models, and to read existing data files into the Excel workbook. Generating multiple models in the one workbook is not compatible with worksheet calculations; multiple models are a definite move towards using vba or another programming language.
DISCLAIMER :
Users of the software must accept this disclaimer of warranty :
The software is supplied as is. The author disclaims all warranties, expressed or implied, including without limitation, the warranties of merchantability and of fitness for any purpose. The author assumes no liability for damages, direct or consequential, which may result from the use of the software.
Revisions:
[03/02/2014] Original
[23/04/2016] Changed download links to MiScion Pty Ltd Web Store
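The three components form a simple pipeline: the front-end writes a data file, the analysis engine is run over it, and the back-end reads the results. A sketch of the wiring (Python; the file format and the stand-in solver are invented so the sketch is runnable — in a real pipeline the middle step would invoke the cpframe executable):

```python
import os
import tempfile

def front_end(model_lines, path):
    """Write the structural model to a data file (the format here is
    invented; the real front-end would write the cpframe format)."""
    with open(path, "w") as f:
        f.write("\n".join(model_lines) + "\n")

def run_analysis(in_path, out_path):
    """Stand-in for invoking the cpframe executable over the data file
    (e.g. via subprocess in a real pipeline). Here it simply copies the
    file across so the sketch runs anywhere."""
    with open(in_path) as f, open(out_path, "w") as g:
        g.write(f.read())

def back_end(path):
    """Read the results file back for member sizing, connection checks
    and footing sizing."""
    with open(path) as f:
        return f.read().splitlines()

tmp = tempfile.mkdtemp()
dat = os.path.join(tmp, "model.dat")
out = os.path.join(tmp, "model.out")
front_end(["NODES 3", "MEMBERS 2"], dat)
run_analysis(dat, out)
assert back_end(out) == ["NODES 3", "MEMBERS 2"]
```

Keeping the analysis engine as a file-in, file-out console application is what makes this decoupling possible: each stage can be developed, tested and replaced independently.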
My primary focus is on the front-end, as there is plenty of structural analysis software out there which already integrates the structural analysis function of cpframe with the back-end function of member sizing, though little software is available for checking connections and sizing footings within the structural analysis software. The available structural analysis software also typically has a facility for auto-generation of the dimensions and geometry of common structural forms. Very little of the software, however, will auto-generate the model with loads applied, and that which is available doesn't cover Australian standards.
With respect to manufactured structural products (MSP's) there are at least three issues for structural design software:
- Rapidly analyse and specify structural requirements, and generate a cost, at the point-of-sale (PoS), for use by salespeople and customers
- Provide adequate documentation for seeking development approval (building permits)
- Aid independent checking and/or testing
The latter issue is important because without adequate consideration it becomes necessary to generate an excessive amount of documentation when seeking development approval for each and every project. MSP's are made in large numbers: if the structure was adequate last week then, barring any major changes in codes of practice, it will be adequate this week and next week. Churning out documentation in the form of a thick report for each and every project becomes silly. The report needs to be kept, as far as possible, to a single-page certificate: but we need independent means of checking the validity of such a certificate.
Since most software for rapid design of MSP's is restricted to use by the manufacturers and their sales agents, no consulting engineer charged by local councils to do an independent check can rapidly generate a design model for use in general purpose structural software. Checks would involve the following:
- Relevance of the software or structural model to the proposed building structure
- Validity of the input parameters to the model
- Validity of the calculated results
Similar checks are also required by consulting engineers employed to check and certify the software. However, whilst such software may need to be certified, such certification doesn't mean the software is relevant to the projects it actually gets used for in its day-to-day usage. The authors of such software also need to be conducting such checks and tests.
To my mind it has never been a simple matter to just dive straight in and start writing software for a manufacturer. The first stage should be developing tools which assist with independent testing through the use of general purpose structural analysis software. To start with, the engineers typically asked to provide such software usually start out by providing the engineering, so they need tools to provide rapid feedback to their clients. {NB: This is not the case of designing a building for an architect and coming back in a week; the feedback needs to be in around 30 minutes. In 24 hours the customer could have gone elsewhere, got a price for a building and placed an order.}
Unfortunately the manufacturers typically want to start putting software to use immediately, and once they do they will start hitting projects which are beyond the scope of the software. The engineer will then have to deal with these variations in rapid time frames. This situation reinforces the need for an interface to general purpose structural analysis tools, so that custom variations to the MSP can be easily handled externally to the manufacturer's software.
Now given that MicroStran is a popular general purpose structural analysis package, and its text based '.arc' files are a common import option with other software, auto-generation of MicroStran '.arc' files would be the more productive option to adopt for the design office.
However developing a front-end to auto-generate structural models for MicroStran is not useful with respect to developing the software required by the manufacturer of MSP's, as MicroStran is not available as an analysis engine for integration into other software.
Therefore we need to pursue parallel paths:
- Auto-generating models for a structures engine
- Auto-generating models compatible with general purpose structures software.
Since we have MicroStran and MultiFrame licenses I will be developing auto-generation of models for these two packages along with models for our structures engine (cpframe). In the first instance I will focus on models for use with cpframe, as engineering graduates have the greatest potential for writing model generators and they don't necessarily have access to the commercial software packages.
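As a sketch of what auto-generating a model involves, the dimensions and geometry of a simple gable frame can be derived from a handful of parameters. The sketch below is in Python rather than VBA, and the Node/Member structures are illustrative only, not cpframe's actual data format:

```python
from dataclasses import dataclass

@dataclass
class Node:
    id: int
    x: float  # metres
    y: float

@dataclass
class Member:
    id: int
    node_a: int
    node_b: int

def gable_frame(span: float, eave: float, ridge: float):
    """Generate nodes and members for a simple pinned-base gable frame:
    two columns and two rafters meeting at the apex."""
    nodes = [
        Node(1, 0.0, 0.0),         # left base
        Node(2, 0.0, eave),        # left eave
        Node(3, span / 2, ridge),  # apex
        Node(4, span, eave),       # right eave
        Node(5, span, 0.0),        # right base
    ]
    # Members chain the nodes in sequence: column, rafter, rafter, column.
    members = [Member(i, i, i + 1) for i in range(1, 5)]
    return nodes, members
```

A front-end for any other standard structural form is the same idea: parameters in, node and member tables out.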
It is also to be noted that most consulting engineers have little interest in auto-generation of structural models, as they mostly work with architects and have to fit a structure to a building, whereas with MSP's the building is fitted to the structure. So most consulting engineers will be increasingly moving towards use of building information models (BIM) as such becomes more affordable and practical.

However BIM forces the use of 3D models, and 3D structural models introduce a multitude of problems with respect to realistic modelling of the real structure. There are components in building structures which cannot be modelled correctly in general purpose 3D structural analysis software. If these components are left out, say girts and purlins for example, then we no longer have a single 3D structure but a series of isolated plane frames. These 2D plane frames are largely the same, so we wouldn't waste time modelling all of them: just model the most heavily loaded frame and make them all the same. This is important for PoS software, as the computer has to wade through all available components and find the ones which work, whereas an experienced designer would start off with a reasonably good guess and would only need a few iterations to find the best solution from the available components.

In short, at present BIM is too expensive, the 3D graphical model has little relationship to engineering models across all disciplines, and a lot of extra work is imposed for no benefit. In any case, for these engineers auto-generation of the structural model is of little value, as the primary requirement would be to auto-generate design actions applied to the dimensional and geometric model created by the architect. For such a situation we idealistically want the structural model, with the design actions applied, generated as the architect designs the building. For example the architect inserts a floor, defines its purpose, and the floor structure is automatically generated.
The structural engineer's task is then to tweak the model and advise the architect of changes to make. Ultimately the architect should be able to remove a column and the beam it supported would turn red, identifying it as no longer suitable and in need of re-sizing. That is, 80% of the expert advice the architect needs would come from the software, the remaining 20% from specialist consultants. It is to be noted that often architects have difficulty finding engineers who are capable of realising the proposed building, and so the buildings get reduced to the pinned and braced boxes which are within the capabilities of the available engineers. {NB: Whilst an engineer may have studied some theory, or be able to pick up and read literature on new or rarely used theory, it doesn't mean they are confident or willing to apply such theory in the real world. So in the first instance computer software allows those who can to enable those who lack the confidence to go it alone. Such software however provides a foundation for enabling and empowering everyone: instead of building physical prototypes and testing, they use the software as the test bed.}
So with MSP's we are dealing with standardised structures, and the proposed building has to fit the available MSP. The structure therefore takes precedence over the building design: the designer has to fit their building into the envelope provided by the MSP. The designer is typically the owner, who has decided that an MSP is suitable for their purposes and who otherwise wants to avoid the delays caused by architects and engineers. Unfortunately they typically miss the delays which will be caused by expecting a sales person to provide design for the custom features the buyer desires. An engineer is called in at the last minute when everything has been rejected by the local city council. Since the engineer can be any available engineer, such engineer first has to get up to speed with the manufacturer's product. This will turn into a hassle, as the manufacturers don't really have product specifications, or spew forth a lot of nonsense about intellectual property (IP). If they had any real IP then they would have patents, and if they have patents, it's public knowledge. In the main they have no real product, no product manager, no product development manager, and an over-willingness to sell whatever the customer asks for. Hence their desire for software to do the design to the customer's requirements at point of sale: but do the design and engineering for what? That is something they have no idea about, except the unrealistic expectation: anything the customer asks for.
Clearly developing an auto-generator with the flexibility to generate a structural model for anything under the sun is a major undertaking, if not impossible. To achieve that would require some form of scripting or modelling language to define a new structural form without going back to source code programming. There would still need to be development of a new interface for the user to incorporate the new MSP; it wouldn't be something done at PoS whilst talking to the customer. {NB: MicroStran has a macro language for auto-generation of complex geometry: as I remember, spirals and such. I never really used it; it's limited, and it's a variation of the .arc file which just removes the need to define all the nodes and their connectivity.}
So whilst there may be flexibility behind the scenes, in the foreground there is necessary restraint on what the salesperson and customer can do. Otherwise one could simply use BIM-type software at PoS and feed into the back office for engineering at some other time after PoS. The whole point of MSP's is that such structures are as close as possible to being off-the-shelf buildings, with a fixed price and comparable from one supplier to another. Too much variation and it's no longer an MSP. A car is pre-engineered: you cannot go into the showroom and ask for it to be supplied with 3 wheels instead of the standard 4. Such variation imposes a need for extensive engineering, which includes building prototypes and physically testing that the mathematical models are valid and not overly unrealistic. Choosing whether or not to have a radio in a car doesn't typically impose a need for additional engineering. Putting a window in a shed doesn't impose a need for engineering; placing a door in a shed which is wider than the spacing of the frames does. Placing a building in a wind loading environment it wasn't originally designed for imposes a need for additional engineering. Most of this structural engineering is little more than calculations to codes of practice. Not all of the engineering however is within the scope of calculation, and some physical testing is often required. Hence the importance of defining the product before writing the software.
Software is also a product, and it should also be designed before being written. However it is recognised that the manufacturers just want to get on with manufacturing. You don't get the full technical drawings and specifications for a car, but then again the customer is not permitted to make changes to the car which would require such information. The building industry does permit making changes which require the additional information, and therefore manufacturers need to make it available: which is extremely difficult if they haven't produced it.
There is a lot of up front work to do before software the likes of MicroStran or Multiframe can be put to use. Most MSP's are well within the capabilities of Australia's 2 year qualified Engineering Associates to design and assess, and such persons are likely more compatible with the needs of the majority of the manufacturers than professional engineers. So if the structural models for the MSP's can be auto-generated and be compatible with commercial structural analysis software, then the engineering associates can handle custom variations in house, a lot more rapidly than queuing up for assistance from a consulting engineer. If such personnel can be employed along with appropriate software then the need for PoS software would be slightly reduced, because going to the manufacturer would still be faster than going to an architect and then going out to tender.
So I would say we are looking at:
- PoS design solutions in no more than 5 minutes during at most a 30 minute interview.
- Behind the scenes solutions in less than 24 hours: same day response.
The latter can be achieved by semi-automating the readily available general purpose structural analysis software. The former requires full automation at PoS, and requires highly specialised custom software.
In the first instance therefore I would recommend semi-automating the readily available software and employing the right people on staff. Such tools would enable all consultants and increase the number available who can deal with custom variations to MSP's, including future extensions, who can test and certify the more specialised software, and who can otherwise independently check and certify individual building projects. Diving straight into the specialised software owned by the manufacturer and only available to their sales distributors just creates a lot of hassle for everyone else involved.
Internal to an application which auto-generates a structural model, it doesn't really matter what structural analysis software is used. That is, the difference between generating a model for cpframe or MicroStran is simply a matter of file format: the data required for the files is the same. Therefore a model generator written for cpframe, if written appropriately, can be easily adapted to a multitude of general purpose frame analysis programs by writing an appropriate file export procedure, the principal task of generating a model having already been completed in the main body of the application.
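A minimal sketch of that separation, in Python: one internal model, with a small writer function per target format. Both text layouts below are placeholders (the real cpframe and MicroStran '.arc' layouts differ); the point is only that swapping target software means swapping the writer, not the generator.

```python
# Placeholder model: (id, x, y) nodes and (id, node_a, node_b) members.
NODES = [(1, 0.0, 0.0), (2, 0.0, 3.0), (3, 6.0, 4.2)]
MEMBERS = [(1, 1, 2), (2, 2, 3)]

def write_cpframe(nodes, members) -> str:
    # Illustrative layout only; the real cpframe data file differs.
    lines = [f"{n} {x:.3f} {y:.3f}" for n, x, y in nodes]
    lines += [f"{m} {a} {b}" for m, a, b in members]
    return "\n".join(lines)

def write_arc(nodes, members) -> str:
    # Illustrative stand-in for MicroStran's text based '.arc' format.
    lines = [f"NODE {n} {x:.3f} {y:.3f}" for n, x, y in nodes]
    lines += [f"MEMB {m} {a} {b}" for m, a, b in members]
    return "\n".join(lines)

EXPORTERS = {"cpframe": write_cpframe, "arc": write_arc}

def export(nodes, members, fmt: str) -> str:
    """Serialise the one internal model to the chosen target format."""
    return EXPORTERS[fmt](nodes, members)
```

Adding another analysis package is then one more entry in the exporter table.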
An MS Excel Template for cpframe.
The simplest, though not very practical, means of generating a model for cpframe is to use MS Excel, perform all the calculations in the worksheet, and connect these to a summary sheet which has the appropriate format for writing to a cpframe data file. To such end I have thrown together a template for this purpose called frontEndPFrame01.xls.
frontEndPFrame01.xls
As it's a template file it doesn't actually do anything. The template merely provides a worksheet which can be adapted to suit various structures by adding extra data rows as required. Buttons are provided to write the contents of the worksheet to a data file compatible with cpframe, and then run a shell to execute cpframe and generate the results file. As the application is procedural, we can be reasonably certain that the data file will exist before the next command is executed, therefore these two buttons can be combined into a single command button, with the data file being automatically generated and passed straight to cpframe. On the other hand we cannot be certain the results file is available before the next command executes, as cpframe may still be in process. Therefore we have to have a separate manually executed task for viewing the results. This command merely opens the results file using Windows Notepad.
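The timing issue described above can be handled by polling for the results file rather than viewing it manually. A sketch in Python, with the caveat that the executable name and the '.res' results extension are assumptions, not cpframe's documented behaviour:

```python
import subprocess
import time
from pathlib import Path

def results_path(data_file: str) -> Path:
    # Assumes the engine writes results beside the data file; '.res' is a guess.
    return Path(data_file).with_suffix(".res")

def run_analysis(data_file: str, exe: str = "cpframe.exe",
                 timeout: float = 30.0) -> Path:
    """Launch the analysis engine, then wait for its results file.

    Popen returns immediately (like VBA's Shell), so without the polling
    loop the next step could run before the results file exists.
    """
    results = results_path(data_file)
    proc = subprocess.Popen([exe, data_file])
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if proc.poll() is not None and results.exists():
            return results
        time.sleep(0.5)
    raise TimeoutError(f"no results file {results} after {timeout}s")
```

The same poll-for-the-file approach works from VBA with a `Do While` loop and `Dir()`.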
As a test, my gable frame spreadsheet can be meshed into the template. This spreadsheet calculates the wind loads on a simple gable frame, the same structural form as set up in the front-end template. So we just need to link the appropriate cells, such as the wind loads on the frame, into the template data sheet. Since the spreadsheet already contains the Kleinlogel formulae for the frame, the results of running cpframe can be checked. Since cpframe can only handle a single load case at a time, conditional formulae would be required to switch the load case the file is being generated for. This is the approach I took in 1996 using Quattro Pro (QPro) and the original version of pframe. {Actually back then the program was called either f_wrk.exe or frame.exe; pframe was the name of my QPro workbook which drove the program.}
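Since the engine takes one load case at a time, an alternative to conditional formulae switching the case is to write one data file per case. A Python sketch, again with an illustrative file layout rather than cpframe's actual format:

```python
def write_case_files(base: str, model_lines, load_cases) -> list:
    """Write one data file per load case, since the engine analyses a
    single case at a time. `load_cases` maps a case name to a list of
    (node, fx, fy) nodal loads; the LOAD record layout is hypothetical."""
    paths = []
    for name, loads in load_cases.items():
        path = f"{base}_{name}.dat"
        with open(path, "w") as f:
            f.write("\n".join(model_lines) + "\n")  # common model geometry
            for node, fx, fy in loads:
                f.write(f"LOAD {node} {fx:.2f} {fy:.2f}\n")
        paths.append(path)
    return paths
```

Running the engine once per generated file then covers all cases without touching the model definition.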
QPro pframe.wb1
As can be seen from the screen capture of the QPro workbook, it contained buttons for collecting data defining the structure, writing the data file for the plane frame analysis program, running the plane frame analysis program, and then printing out the multitude of worksheets used to design all the various components of the whole building extending beyond the primary frame. To the right of the worksheet is a small table showing the load cases for the old permissible stress design, with a marker showing the current load case for which the plane frame data file is to be created.
Plane Frame Analysis Launched in front of QPro Workbook
With the original version of the plane frame application, I had to manually open each data file and run the analysis inside the plane frame application. With cpframe those steps are removed. Still, with the original application that is how I manually incremented the heights of the structures until I broke the section desired to be used, and then produced standard calculations for the maximum height structure possible. {Stepping back, of course, to what did work. I also happened to know which load case was most likely to be the critical load case, therefore I only needed to increment height for one load case, and check the other load cases when it appeared the maximum height had been reached.}
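That manual increment-and-check procedure is easily mechanised. A sketch, where `check` is a stand-in for the full cycle of generating the model, running the analysis, and comparing member actions against section capacity:

```python
def max_height(check, start: float, step: float = 0.25,
               limit: float = 20.0) -> float:
    """Increase height until check(height) fails, then report the last
    height that worked (the stepping-back is implicit: a height that
    fails is never committed)."""
    h = start
    while h + step <= limit and check(h + step):
        h += step
    return h

# Hypothetical capacity check: the chosen section works up to 6.25 m.
tallest = max_height(lambda h: h <= 6.25, start=3.0)
```

A real `check` would write the data file, run the engine, and read the results file, but the search loop is the same.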
The QPro application never got fully converted to MS Excel for a variety of reasons: basically all the building blocks are there, just not connected up. In any case, using the worksheet to do the calculations is not the most efficient way to do things. It's intuitive and fast initially, but cumbersome at a later date when needing to add additional members and load cases. It is better to use the wind load functions in schTechLIB directly in VBA code using arrays, rather than reference the functions to carry out calculations in worksheet cells. Calculations in worksheet cells are fine for presenting the calculations but a hindrance to more complex tasks.
Download frontEndPFrame01.xls
For a more flexible approach I will release an alternative template, with the data spread across multiple worksheets, making it easier to shrink and expand the structural models, and providing for easily reading existing data files into the Excel workbook. Generating multiple models in the one workbook is not compatible with worksheet calculations: multiple models is a definite move towards using VBA or another programming language.
DISCLAIMER :
Users of the software must accept this disclaimer of warranty :
The software is supplied as is. The author disclaims all warranties, expressed or implied, including without limitation, the warranties of merchantability and of fitness for any purpose. The author assumes no liability for damages, direct or consequential, which may result from the use of the software.
Revisions:
[03/02/2014] Original
[23/04/2016] Changed download links to MiScion Pty Ltd Web Store