Personally I’m not fond of formal methodologies. Imho the whole systems engineering approach with its semi-iterations is a bit too much theoretical blabla.
Extreme programming (XP) has shown that in software engineering a less formal or lighter methodology gives much better results. Hardware engineering isn't that different from software engineering. Both are dynamic and creative processes. Both get a 'flow' or 'buzz' going. My question is: should engineers go extreme?
With both feet on the ground you won't get far.
You are going to make me get out my software engineering book, lol. As an EE student I only took one software engineering course, but I will give an opinion some time later. P.S. I forgot that is what the XP stood for in Windows XP, if I ever knew.
Dig into the [url=http://child-civilization.blogspot.com/2006/12/political-grab-bag.html]political grab bag[/url] at [url=http://child-civilization.blogspot.com/]Child Civilization[/url]
They certainly shouldn't leave an avenue unexplored. Just because you think the world is flat doesn't mean there isn't a paradise just over the horizon.
If I hadn't gone and looked at the scientific advances of the ancient Egyptians, I wouldn't have noticed that gold was useful as an electrostatic collector. Nor would I have then looked at mesons and neutrinos as the carriers of electrons. From there I looked at superconductivity and found that neutrinos, a field particle, were decelerated across a superconducting boundary because they collide and are repelled by their own reflection. Certainly I wouldn't have considered that neutrinos can be captured, confined and redirected to propel a small mass probe without the need to refuel.
If scientists and engineers don't look at the extremes, they will miss out on all sorts of things.
Okay, maybe I should elaborate a bit further. My intention is to start a discussion about engineering approaches or methodologies. The systems engineering approach allows the organization of complex engineering projects. Systems engineering theory was developed during the Second World War as a result of higher reliability requirements and is still being developed today. Prior to WW2, projects were performed without a structured methodology.
During the Second World War engineering projects became increasingly complex and difficult to manage, and requirements became more stringent. To cope with this increased complexity the chief engineer started using a team of systems engineers to develop the project. After WW2 the complexity of engineering projects increased rapidly, and during this period it became clear that a systematic approach was required to cope with it. An early method was the Program Evaluation and Review Technique (PERT), which helped during planning. It was also found (partly through blowing up rockets) that changes were needed in the engineering process, and concepts such as part traceability, change control and interface control were developed.
The systems engineering process is nowadays defined as an iterative process of technical management, acquisition and supply, system design, product realization and technical evaluation. In theory, during each step in the design process many alternatives should be evaluated and traded against each other, so that the 'best' solution will be found. A whole range of tools has been thought up to aid the engineer during these steps: functional and operational diagrams, requirements discovery trees, RAMS analysis, etc.
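To make the PERT idea a bit more concrete, here is a rough Python sketch of the forward pass: expected durations from three-point estimates and the resulting project length. The activity names and all the numbers are made up purely for illustration, not taken from any real project.
[code]
from functools import lru_cache

# activity: (optimistic, most likely, pessimistic, predecessors) -- all invented
ACTIVITIES = {
    "requirements":  (2, 4, 6, ()),
    "design":        (3, 5, 9, ("requirements",)),
    "prototype":     (4, 6, 10, ("design",)),
    "test":          (2, 3, 5, ("prototype",)),
    "documentation": (1, 2, 4, ("design",)),
}

def expected(optimistic, most_likely, pessimistic):
    """Classic PERT three-point estimate (beta approximation)."""
    return (optimistic + 4 * most_likely + pessimistic) / 6

@lru_cache(maxsize=None)
def earliest_finish(name):
    """Forward pass: an activity can only finish once its slowest predecessor is done."""
    o, m, p, preds = ACTIVITIES[name]
    start = max((earliest_finish(q) for q in preds), default=0)
    return start + expected(o, m, p)

for name in ACTIVITIES:
    print(f"{name:14s} expected finish: week {earliest_finish(name):.1f}")

print(f"estimated project duration: {max(map(earliest_finish, ACTIVITIES)):.1f} weeks")
[/code]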
In software development a similar evolution occurred. In the early days, programmers (mostly scientists) wrote their programs ad hoc. This wasn't necessarily a bad thing, since the programs were relatively simple and ad-hoc writing is the quickest way to write a program that does one single thing. However, software became more and more complex and it didn't take long before a more structured approach was required. With the development of procedural languages the GOTO statement was banned and programming transformed into software engineering. Soon it became clear that even procedural languages have limitations, especially concerning code clarity and reuse. Object-oriented languages revolutionized the way software is engineered, using easy-to-understand abstractions. However, problems similar to those experienced in systems engineering arose. To assure high-quality software, analysis, design and implementation were separated into phases. Tools such as concept maps, domain models, use cases, static/dynamic object models, dataflow diagrams, etc. were created to crystallize the design. Several abstract methodologies such as OMT and OBA were (and still are being) developed to structure the modeling process.
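As a toy illustration of the kind of abstraction object-oriented languages added (my own contrived shapes example, not from any particular methodology book): the procedural version has to know every case up front, while the object-oriented version lets new cases be added without touching the existing dispatch code.
[code]
from dataclasses import dataclass
from math import pi

# procedural style: one function that must know every kind of shape
def area_procedural(kind, **dims):
    if kind == "circle":
        return pi * dims["radius"] ** 2
    elif kind == "rectangle":
        return dims["width"] * dims["height"]
    raise ValueError(f"unknown shape: {kind}")

# object-oriented style: each shape carries its own behaviour
@dataclass
class Circle:
    radius: float
    def area(self):
        return pi * self.radius ** 2

@dataclass
class Rectangle:
    width: float
    height: float
    def area(self):
        return self.width * self.height

print(area_procedural("circle", radius=1.0))                 # 3.141592653589793
shapes = [Circle(1.0), Rectangle(2.0, 3.0)]
print([round(s.area(), 2) for s in shapes])                   # [3.14, 6.0]
[/code]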
During the '90s a growing group of software developers were getting increasingly annoyed by the fact that they spent far too much time writing all kinds of documents instead of code. People started to get lost in the abstract process of modeling and lost sight of the common goal. Following this, Kent Beck developed a new lightweight and very practical methodology called Extreme Programming (XP) and published his ideas in the book Extreme Programming Explained in 2000. The XP approach is based on increasing productivity (coding) by providing practical guidelines. Two vital practices of XP are pair programming and writing tests prior to the actual code. The result has been phenomenal: software is developed more quickly, code is of higher quality and programming has become much more fun. XP also stimulates communication with the customer, which prevents requirements mismatches.
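Here is a minimal sketch of that test-first habit using Python's built-in unittest module. The propellant_fraction function, its name and the test values are something I made up for illustration (it happens to be the rocket equation, to stay on topic); the point is only the order of work: test first, then the simplest code that passes.
[code]
import unittest
from math import exp

# Step 1: the test is written first; on a fresh project it fails,
# because propellant_fraction() does not exist yet.
class TestPropellantFraction(unittest.TestCase):
    def test_typical_launcher(self):
        # ~9.4 km/s of delta-v at ~4.4 km/s exhaust velocity should need
        # roughly 88% of the vehicle mass as propellant
        self.assertAlmostEqual(propellant_fraction(9400, 4400), 0.882, places=3)

    def test_zero_delta_v(self):
        self.assertEqual(propellant_fraction(0, 4400), 0.0)

# Step 2: write the simplest code that makes the tests pass
# (here, the Tsiolkovsky rocket equation rearranged for mass fraction).
def propellant_fraction(delta_v, exhaust_velocity):
    return 1.0 - exp(-delta_v / exhaust_velocity)

if __name__ == "__main__":
    unittest.main()
[/code]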
One of the first steps when converting to XP is to rearrange the office layout so everybody can communicate directly with each other (no more cubicles). A strong aspect of XP is that it doesn't rely on theoretical iterations; the process is such that constant updating of the code is common practice. Because the structure of an XP team is less formal, communication is much quicker.
Now, the funny thing is that in the automotive and aerospace industry a similar development is happening. A new type of design practice called Concurrent Engineering (CE) is being developed by most large aerospace companies. In the CE design practice, work is performed in parallel instead of in series. When, for example, a designer has a first impression of a bonnet, a structural engineer takes this first model to start his own work. He knows that the design will change and that he will need to update his structural design, and his tools are such that updates can be made in real time. As soon as there is an update in the design or in the structure, the designer and the engineer get together, discuss the modifications and update/merge the design. The difference from ordinary systems engineering is that all this happens naturally. All the domain experts (including manufacturing) are in the same room. Information technology is an important enabler of CE: all team members need to work on the same design model, and the tools used by the domain experts must operate in near real time.
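As a toy sketch of that "one shared design model" idea (the class names, parameters and the crude sizing rule below are all invented for illustration, not how any real CE toolchain works): the structural engineer subscribes to the model and is notified the moment the designer changes a parameter.
[code]
class SharedDesignModel:
    """A single parameter set that every discipline works against."""
    def __init__(self, **parameters):
        self.parameters = dict(parameters)
        self.subscribers = []

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def update(self, name, value, author):
        # record the change and tell everyone about it immediately
        self.parameters[name] = value
        for callback in self.subscribers:
            callback(name, value, author)

model = SharedDesignModel(panel_thickness_mm=2.0, panel_width_mm=500.0)

def structural_check(name, value, author):
    # the structural engineer reruns a (very crude) sizing rule on every change
    t = model.parameters["panel_thickness_mm"]
    verdict = "OK" if t >= 1.5 else "too thin, talk to the designer"
    print(f"[structures] {author} set {name}={value}: {verdict}")

model.subscribe(structural_check)

# the designer slims the panel down; structures sees it immediately
model.update("panel_thickness_mm", 1.2, author="designer")
[/code]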
So as one can see, there are parallels. Both fields seem to be moving toward a less formal process with direct communication between team members and with the customer.
With both feet on the ground you won't get far.