Technical experts can also be successful in advisory sales

Consultative selling and communication skills trainer
Engineers and technical professionals are used to thinking of their advisory role in terms of content alone. But if they want the customer or stakeholder to act on the advice they offer, something else is needed: acceptance from the person for whom the advice is intended. That’s where sales skills come in. Claus Neeleman trains technical experts in successful advisory sales. ‘Once you understand how the sales process works, you can advise both external and internal customers much more effectively, with a positive effect on your company’s results.’

Consultative selling, or advisory sales, is an effective sales method and therefore receives a lot of attention. According to trainer Claus Neeleman, this attention is justified. ‘Advisory sales is the best thing for the customer: it is about finding the best solution for that customer and matching it to your own interest, namely the margin on the products or services that you sell. Advising and selling are therefore both important. The trick is to create value for the customer. That value is in good advice that yields more than what the customer pays for it. In engineering companies, engineers have an important, supporting sales role, because they know exactly what matters in terms of technical content. In the high tech environment, it is the content that sells, not the sales talk.’

'Advisory sales is the best thing for the customer: it is about finding the best solution for that customer and matching it to your own interest, namely the margin on the products or services that you sell.'


Claus Neeleman trains technical experts in successful advisory sales.

'The trick is to create value for the customer. That value is in good advice that yields more than what the customer pays for it.'

Neeleman has a friendly personality and an intelligent gaze. He is qualified as an occupational and organisational psychologist. He has worked at an assessment agency and at a reintegration agency, amongst other things as regional manager. ‘When you carry out an assessment, you analyse and test people, which I thought was super fun and still do. You find out how to see people’s qualities and pitfalls, with the aim of helping them improve. At the reintegration agency, that didn’t always help, because in that environment commerce plays a major role. This sometimes results in moral dilemmas. Do you help the person you have to put a lot of energy into, or the person who doesn’t cause much bother? I did this type of work mainly to help people move forward in their careers and their lives, so such choices were not what I wanted. That’s why I decided to become a trainer. Of course, I also took training courses myself and discovered that it is a fascinating field. Training is something positive. People improve after taking a training course, they like it and are enthusiastic afterwards. That gives me energy. And I find it more fun to talk and to be busy with people than to write reports at a desk.’

Lots of practice

Neeleman has been working as a trainer for some eighteen years. He focuses mainly on practical skills. ‘Much of what I teach comes from psychology, combined with insights from the field about how you can influence people and what the effects are. The content of a conversation can therefore be the same, while the strategy for transferring that content to another person differs. The best approach depends upon the situation and the people in question. I firmly believe that practice is the best way to learn how to sell in an advisory capacity. The theory behind it is not complicated at all, but to better address conversations with customers you first have to experience what it is like when you try out different behaviour.’

'To better address conversations with customers you first have to experience what it is like when you try out different behaviour.'

Teacher of the year

For several years now, Neeleman has been giving two training courses at High Tech Institute: Effective Communication Skills for engineers and Sales skills for engineers. In 2016, he was named trainer of the year at High Tech Institute, with an evaluation score of 9.1 out of 10. Trainees called him an impressive, inspiring and empathetic teacher and said that he is excellent at explaining things and tailors the course to their needs.


In 2016, Claus became High Tech Institute’s ‘Teacher of the year’.

'Every advice moment comes with a sales moment.'

That is quite special, because selling is not the favourite job of technology professionals…
‘Correct. They also often think that they only give advice. But that is incorrect. What they don’t see is that they use less effective strategies in conversations with the customer. The result, however, is noticeable: as soon as the customer puts them under pressure, they give a discount that is not in their interest. Or they are too customer-friendly and forget to make agreements about the remuneration for their consultancy work. Or they are unclear about the costs. During an ongoing contract, every advice moment comes with a sales moment. You have to pay attention to that.

But also in the initial phase of contact with the client, a technician must be sufficiently convincing to make the sale of a service or product succeed. How do you ensure that you come across well and generate trust? How do you give the customer the idea that you are strong enough to carry out the project? You have to create trust and adapt your communication style to the customer and to what is important to him or her, both with regard to content and personal interaction. And if you work together with an account manager, you have to learn to speak one another’s language, so that you know what your colleague’s intentions are and what the other person is doing in the sales process. The salesperson must of course also know when the content is important.’

That sounds pretty difficult.
‘In reality it’s not such a big deal! The theory is a tool, a model that tells you which steps to take. Analytical people, such as technicians, can handle this very well. For example, the theory says that you often generate resistance from the customer yourself. This happens, for example, if you are more concerned with your own goals than with those of the customer, or if you put too much pressure on them. That is what we call counter-behaviour. If you constantly know things better than your customer, they will start to object. And if you are too dominant in the speed at which you talk about things or try to force a decision, this also provokes resistance. Counter-behaviour doesn’t help you sell your solution. But if you connect with your customer and enter into a constructive dialogue, you will build things. The customer then moves along with you much more smoothly. If you encounter resistance during a conversation, you can counter it by adjusting your behaviour, for example by leaving more of the pace of the conversation to the customer and by clearly putting his or her interests first.’

Do you yourself have to change in order to sell better?
‘That’s not necessary at all. You remain yourself; you only choose to exhibit different behaviour in certain situations in order to be more effective. If you are aware of the way a sales process progresses and you know what works, you can determine much more effectively what effect you want to have on others. It is not about right or wrong. You can reach your goal in many ways. But if you want to bring your story on stage successfully, it certainly helps if you know how to carry out advisory sales. And you can easily do that without forcing yourself into a situation that you don’t like.’

'As soon as you understand the sales process, you can advise more effectively. '

What is the secret of a successful advisory sales conversation?
‘You need two ingredients: a sound story and acceptance by the customer. The latter means ensuring that the customer can accept your advice. You do this by surfacing the questions, feelings and doubts that could prevent the customer from accepting your product or service, and by giving good answers to them. Interviewing your customer based on the signals he or she gives you is not easy for technicians, because technicians deal mainly with facts and less with emotions. But, with a little practice, they can learn how to do this.’

How does such a conversation proceed?
‘The first phase is the contact phase. Technicians often find it difficult to get through this part and prefer to go straight to the content. But the first phase is important for generating trust and for creating a good personal relationship. In this phase, you also decide what you are talking about. You show that you have thought about the customer’s problem and you indicate that you already have a few ideas. In the contact phase you also agree on your way of communicating with the customer. If you have the same communication style, that’s easy. A customer can also be very directive and want to decide quickly. As a technician you have a tendency to look at a problem from all sides, but this type of customer gets irritated by that. So, if you find that time and money are important goals for a customer, then you have to respond to that information. You will then get more space for the content later in the conversation.

In the second phase you make an inventory, mapping out the customer’s needs. There are proven, effective methods for that. As a result, the customer recognises the scope of his or her problem and wants to take action. You cannot achieve that by telling them that they have a big problem; you do that by asking questions. This leads to a sense of urgency, the idea that something has to be done.

The third phase is the presentation of your advice, where you show your skills and influence people. In the fourth and final phase you help the customer to come to a decision by taking steps together in the decision process. This is the actual advice work.

All in all, an advisory sales conversation is more about the customer than about you. The customer is king.’

Tips from Claus

‘Be happy with critical questions or reactions, because this is the moment when you have contact about the content. When this happens, don’t try to be smarter or question the question; instead, go deeper into it, because there is a fear or worry hidden behind such a question. So, take a step towards the customer by using criticism as positive input. After all, the customer knows the most about the problem for which he or she needs your advice. The beauty of this is: what you learn during this training course, you can also apply in other situations inside and outside your company. Acceptance comes with every deal. It’s all about influencing.’

This article was written by Mathilde van Hulzen, tech editor of High-Tech Systems.

Recommendation by former participants

At the end of the training, participants are asked to fill out an evaluation form. To the question 'Would you recommend this training to others?', they responded with an average score of 9.1 out of 10.

Passive damping: increasingly part of a high tech engineer’s standard toolset

Trainer High Tech Institute: Kees Verbaan
Passive damping has been a standard tool for civil engineers and architects for quite some time. Mechanical engineers designing for micron accuracy, however, typically tried to avoid the use of damping. Now that the high tech world has entered the domain of sub-nanometer precision, mechanical engineers are increasingly discovering that passive damping is an effective medicine for contemporary precision ailments.

In recent years, passive damping has become more and more a standard tool for precision engineers. It is no coincidence that the five-day training course Design Principles for Precision Engineering devotes a whole day to this subject. Due to the increasing importance of passive damping for systems with sub-nanometer position requirements, High Tech Institute partner Mechatronics Academy has developed a special training course on this topic. Top experts Hans Vermeulen and Kees Verbaan teach this new course, Passive damping for high tech systems.

Hans Vermeulen first came into contact with passive damping at Philips CFT in the late nineties. Since mid-2000, he has worked at ASML, where this technology has meanwhile been implemented in various sub-systems to achieve sub-nanometer precision. He is also a part-time professor at TU Eindhoven for one day a week. Unhindered by the daily hustle in Veldhoven, Vermeulen is able to focus, among other things, on passive damping. The fact that his lectures in this field started several years ago shows that passive damping is very much in the spotlight.


Hans Vermeulen explains that ASML is increasingly using passive damping to achieve sub-nanometer precision.

Colleague-trainer Kees Verbaan received his doctorate in robust mass dampers for motion stages in 2015. He works for the NTS Group, a first-tier supplier for high tech machine design. In his role as system architect, Verbaan sees passive damping technology as becoming well established in many high-end companies.


System architect Kees Verbaan, who obtained his doctorate in robust mass dampers, now sees his professional field becoming well established.

In the world of gross dimensions (centimetres instead of nanometres), passive damping is encountered everywhere. Put your finger on a vibrating tuning fork or nail a large rug to the wall and you readily apply passive damping. The automobile industry frequently applies it to car doors. A layer of sound-deadening film provides a good sound experience. When you close the door, you don’t hear the sheet metal resonate annoyingly: the damping layer provides the gentle sound that we associate with quality. The energy doesn’t stay in the material as a continuous vibration, but is converted into heat via a layer of bitumen on the inside of the door. A rather extreme example of a passive damping design is to be found in Taipei 101, the tallest building in the eponymous Taiwanese capital. Because earthquakes and typhoons occur quite frequently, the 101-storey building is equipped with a tuned mass damper, a huge spherical mass of more than eight hundred tons that hangs at the top of the building on four cables and is provided with large viscous dampers. In the event of vibrations caused by earthquakes or severe storms, the sphere moves out of phase, absorbing a large part of the building’s kinetic energy. ‘Similar techniques are also now entering high tech,’ Hans Vermeulen says. ‘In recent years, damping layers – so-called constrained layers – have been applied in high-precision stages, and tuned mass dampers are being used to suppress disturbing vibrations at specific frequencies to increase the accuracy of the entire system.’

In high tech mechanical engineering, the application of passive damping has been avoided and worked around for a long time. This is mainly because designers were able to reach their goals (and often still can) with the traditional approach of using relatively stiff structures in metal or ceramics and metal springs to get predictable behaviour.

Plastics, rubber and composites

Although the use of plastics, rubber materials and composites can significantly reduce unwanted vibrations, their application has never been that popular, because the hysteretic behaviour of these materials potentially makes precision systems unpredictable. Another reason is that, for a long time, analytical tools such as finite element analysis and the computers running them lacked the computing power to properly predict the influence of passive damping in structures made from such exotic materials. In recent years, however, things have changed.

It may be a truism, but it’s still very true: in the world of high tech systems, the demands for precision are constantly increasing. Semiconductor manufacturers want lithographic machines that are able to make patterns reliably with sub-nanometer precision. Biotechnologists need microscopes that allow imaging of DNA structures at the atomic level, and medical professionals rely on diagnostic equipment with, if possible, molecular resolution. In all sectors, demands are rising to such an extent that mechanical designers and architects can no longer rely on their standard toolset.

'In the traditional toolset of a design engineer there used to be three drawers of tools. Now it appears there are six.'

It appears that passive damping can make a very significant contribution here. The approach has proven its effectiveness, also in high tech equipment. ‘The nice thing about damping is that a whole new box of tricks becomes available,’ Verbaan says. ‘Precision engineers really benefit from a few additional pieces on their chessboard. I like that, because in a design engineer’s traditional toolkit, there were only three full drawers. Now it turns out that there are three more, full of new types of tools that he didn’t use before.’ He underlines that damping is an extension of the solution space, not a replacement. ‘If you don’t master traditional design, the additions will not bring you much.’

‘When requirements were less demanding, designers were used to a predictable solution space consisting of masses and springs,’ Vermeulen says. ‘In traditional design you deal with linear relationships, such as those between force and position or between stress and strain. To limit the negative effect of amplification at resonance, designers make sure that the natural frequencies of the system are sufficiently high. That translates into light and rigid designs, using low-mass solutions and stiff materials and geometries.’

Monolithic leaf spring

Hooke’s law states a linear relationship between force and position, or between stress and strain, for linear elastic materials. This means that an elastic material returns exactly to its original position when the load is removed, which is nice, because as long as you know the forces that act on the system, you can accurately predict the position. Take the example of a monolithic leaf spring, a solid block of metal that has been machined with holes and slots into a mechanism based on masses and springs. Such a structure exhibits reproducible linear behaviour, free from hysteresis. From a control perspective, however, this approach may create problems when higher precision is required.
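For reference, the linear relations referred to here can be written compactly (a minimal sketch in standard textbook notation, not taken from the course material):

F = k\,x                      % force versus displacement for a spring with stiffness k
\sigma = E\,\varepsilon       % stress versus strain for a material with Young's modulus E

As long as the stiffness is constant and the applied forces are known, the position follows directly and reproducibly from these relations.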


Typical construction with integrated tuned mass damping. Photo: Janssen Precision Engineering.


Example of a monolithic leaf spring. A solid block of metal is machined with holes and slots into a mechanism based on masses and springs. Such a structure exhibits reproducible linear behaviour but has the disadvantage that it ‘sounds like a clock.’

In this type of design, the control system suffers from long-lasting vibrations. Resonances can be excited by forces within the system itself, such as imposed motion profiles, but also by external influences, for example floor vibrations or air displacement. Without damping, these vibrations remain in the system for a long time: the structure cannot get rid of the vibrational energy.

Mechanical engineers tend to say ‘it sounds like a clock,’ and in this case that is not a positive observation. High-frequency resonances are generally difficult to get rid of via active control. That is why system designers always try to make sure that these types of resonances lie outside the area of interest. In practice, the first natural frequency is typically designed roughly five times above the control bandwidth. That way, the control system is not affected in the lower frequency range. Vibrations caused by disturbances do occur, but their effect does not limit performance.

If the demands for accuracy increase, however, designers using the traditional approach will be forced to achieve higher natural frequencies within the design. ‘The demands are increasing,’ says program manager Adrian Rankers of Mechatronics Academy. ‘That will come to an end, because it is not manufacturable anymore.’

Aversion

The traditional approach was sufficient for high tech system designers for many years. But in their search for increasing precision, all high-end system suppliers are now looking at the possibilities of implementing passive damping. Vermeulen: ‘I dare to say that it is becoming standard in the high tech systems industry. Not everyone is familiar with it, but it is expanding.’ Verbaan: ‘The big players such as ASML, Philips, TNO and ThermoFisher have the time to develop their knowledge and conduct research.’

Vermeulen: ‘Damping means that you deviate from the linear elastic behaviour of materials as defined by Hooke’s law. This is because the material converts part of the energy into heat. If you plot force against elongation in a graph, the dissipation is expressed in the hysteresis loop. The surface of this loop is proportional to the dissipated energy: the damping that you can provide to the structure.’ In addition, the stiffness and damping properties of rubber are temperature- and frequency-dependent (for specialists: linear viscoelastic models can be used for rubbers). As a result, these types of damping materials have been avoided for a long time: a system can have different states under the same load conditions. Vermeulen: ‘That means uncertainty in position.’ Precision engineers have an aversion to this. ‘With damping you deviate from the linear relationship. You pass through a hysteresis loop when the force increases and decreases again, and you don’t know exactly how, since not all the forces that affect the system are exactly known. Often there are disturbances from the outside and then you can end up in a position that was not predicted beforehand. We have actually sought to avoid that uncertainty for a long time. As a result, everyone in the high tech systems sector has avoided damping and has designed things traditionally using masses and springs. But at a given moment, the possibilities come to an end.’
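The remark that the loop area corresponds to the dissipated energy can be captured in a short formula (a sketch in standard notation, not taken from the course material):

\Delta W = \oint F \,\mathrm{d}x    % energy dissipated per load cycle = area of the force-elongation hysteresis loop
% Assumption: for harmonic excitation of a linear viscoelastic element with storage stiffness k,
% loss factor \eta and amplitude x_0, this evaluates to \Delta W = \pi\,\eta\,k\,x_0^2 per cycle.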

Venom

The venom, however, hides in the above-mentioned hysteresis loop. It is more complicated to predict the behaviour correctly, because, as mentioned, the system can be in different states. This means that operating and controlling is complex in environments where floor vibrations and small variations in air pressure or temperature cause major disruptions. A soft exhalation over a wafer stage already produces a standing wave with an amplitude of several tens of nanometers, while the stages need to be controlled at sub-nanometer level. Over the last few decades, the pursuit of the holy grail of completely predictable behaviour of guide ways has been expressed in avoiding friction as much as possible, even though friction also provides energy dissipation and hence damping. ‘In many applications, Coulomb friction is not desired,’ Vermeulen says. ‘Also, rolling elements don’t work in every situation. That is why air bearings are popular. They hardly have any friction.’ IBM already used air bearings in its hard drives in 1961. Lithographic equipment developed in the sixties and seventies at the Philips Physics Laboratory was equipped with virtually frictionless oil bearings; these days, air bearing technology is used in multiple systems. Vermeulen: ‘With the classical box of tricks to design frictionless guide ways, avoiding play, and applying high-stiffness springs with limited mass, we were able to make the behaviour predictable for a long time. But for nanometer applications and beyond, this is no longer sufficient.’

Wobbly pizza disk

Until recently, the classic approach was fine for designing motion stages for wafer steppers and scanners. By using structural metals and ceramics, such a stage can be made lightweight and stiff. The natural frequencies are high enough not to be limiting for high-bandwidth control. However, the requirement for sub-nanometer precision makes the introduction of more rigorous measures necessary.

'At the nanometer level it is as if you have to keep a wobbly pizza disk quiet with your hands.'

During his PhD, Verbaan investigated the influence of passive damping on a positioning system for 450-millimetre wafers. Such a stage has outer dimensions of roughly 600 by 600 millimetres. ‘At the nanometer level it is as if you have to keep a wobbly pizza disk quiet with your hands,’ says Verbaan. He compared various materials and used finite element analyses to investigate and optimise the influence of mass distributions on performance.

Such a large system is susceptible to multiple resonance frequencies. To be able to control the stage accurately, these resonances must be suppressed. ‘For one frequency it is clear how that is done, and you can also put that into a simple model. But if you have multiple resonance peaks across a broad frequency band, that is virtually impossible. Then you get a model that is too complex to handle.’

That is exactly what engineers encounter in practice. The first ‘hurdle’ that limits system performance is the first natural frequency, the frequency at which an object starts to vibrate violently when excited. The traditional approach is to try to increase this frequency. If the means for this are exhausted, damping can help to suppress the resonance amplitudes. The first eigenfrequency of a square wafer table is, for example, the torsion mode, in which two pairs of opposite corners move in phase. But at higher frequencies everything starts to rattle, due to the numerous parts and components that are attached to the table, such as connectors and sensors. ‘Multiple small masses that vibrate at kilohertz frequencies. They will ultimately determine the dynamic behaviour. You are not able to solve this via active filtering in the control system because there are so many of them. With passive damping, however, you can solve all of that,’ says Vermeulen.

Hans Vermeulen shows on a graph how damping can reduce a resonance peak.

Verbaan: ‘What helps is that damping materials such as rubbers and liquids, and the dampers you design with these materials, typically behave very suitably at those high frequencies, primarily because of their frequency-dependent material properties. At low frequencies, they behave like a low-stiffness spring and therefore give in a little bit, but at higher frequencies they become viscous.’ Vermeulen and Verbaan’s training course makes it clear that, although you can make the field of damping extremely difficult, there are also very good rules of thumb and several very useful design principles. Verbaan: ‘Our goal is to outline the entire palette of options and ensure that students attending the course can get to a solution using the right approach. You can let modern computers calculate for days or even weeks, but then you have to be a real specialist. We want to provide the course participants with various possibilities for applying damping. They are taught the background of modelling, as well as the simple approach to the problem, so that they can apply damping correctly.’

‘Potential students are people with a design principles background on the one hand,’ says Rankers. ‘They want to apply damping in practice. On the other hand, system architects will also be interested, so that they are aware of the possibilities that damping can offer.’


Kees Verbaan draws a motion stage that needs to be kept steady in the vertical direction. All kinds of forces act on such a table, ranging from horizontal motors for acceleration to vertical actuators that keep the wafer on the table at the correct height. In the first vibration mode, two opposite corners move up or down simultaneously, while the other corners move in the opposite direction. The resulting motion can be in the order of tens of nanometers, while the stage requires sub-nanometer position control.

Vermeulen and Verbaan underline that passive damping is not a miracle cure. An integral design approach is indispensable. ‘I have heard engineers say: leave that mistake in for now, we’ll solve it later with the controls,’ says Verbaan. People sometimes come to him with systems that don’t achieve the desired performance and ask him whether passive damping can fix it. Verbaan: ‘Sometimes this is dealt with too easily. You cannot simply forget the basics of sound mechanical design. It all starts with a lightweight and stiff design, which indisputably remains necessary, also for the proper functioning of damping. The palette of options is getting bigger, but damping is not a replacement.’

In the course ‘Passive damping for high tech systems’, Verbaan and Vermeulen explain multiple damping mechanisms in detail, such as material damping, tuned mass and robust mass damping, constrained layer damping, and eddy current damping. Starting with damping implementations in other application areas, such as civil engineering and automotive, the focus is on the design, modelling and implementation of passive damping in high tech systems. Stan van der Meulen, co-trainer of the course, will focus on the application of viscoelastic damping in a semiconductor wafer stage.

This article was written by René Raaijmakers, tech editor of High-Tech Systems.

Recommendation by former participants

At the end of the training, participants are asked to fill out an evaluation form. To the question 'Would you recommend this training to others?', they responded with an average score of 9.2 out of 10.

‘Applied optics’ training shifts focus to demos and experiments

optics training
Experts from TNO and trainers from T2prof are putting the finishing touches to the renewed ‘Applied optics’ training courses, commencing in February in Delft and in Eindhoven. The focus is shifting: there is less hard maths, making way for demos and hands-on experiments.

The ‘Applied optics‘ course from T2prof originates from the Philips Center for Technical Training. High Tech Institute launches the course in an exclusive partnership with T2prof. The first edition dates back to 2003. Shortly before that, ASML had indicated to Philips CTT that it needed a course to give electronic engineers, mechanical engineers and chemical engineers a better understanding of the optical R&D world in which they needed to operate.

The idea behind ASML’s request was to prevent a Babylonian confusion of tongues within research projects. If non-optical engineers were to know more about lenses, reflection, refraction, collimators, lasers and the like, they would be able to work more effectively with the optical specialists within the company. This is part of a growing trend in the high tech world. Companies derive their innovative strength less and less from individuals and more and more from multidisciplinary teams. If people are able to work together more efficiently, this in turn benefits development and innovative strength.

Experts from TNO and trainers from T2prof have spent the last few months updating the training to make the content more in line with the latest technological developments. Most of the difficult maths has disappeared. This has created room for more practical matters such as optical systems, aberration correction and the interaction between light and matter. The training course runs both in Eindhoven and in Delft (TNO).

A new timetable also applies to the TNO training in Delft. In Eindhoven the course is spread over sixteen afternoon sessions for a period of eight months. In Delft that becomes eight sessions spanning both afternoon and evening, spread over sixteen weeks. The training is known to be challenging, but in recent years it has, on average, been valued by the participants at more than 8 on a scale of 10.


The ‘Applied optics’ course in Delft will be organized upon request (eight sessions spanning both afternoon and evening). Also, enrollment is open for the ‘Applied optics’ course, starting twice a year in Eindhoven (fifteen afternoons).

Historical baggage

There are normally two types of trainee, says Jean Schleipen, who has been one of the three trainers in the ‘Applied optics’ course for the past four years. ‘One half readily absorbs the content and gets right down to the maths and the homework. They want to master the profession. Others need more of a global picture. Think of marketing people who find it enough to be roughly up to date in all optical areas. My aim, in addition to transferring knowledge, is to fascinate all participants and make them enthusiastic about our beautiful and important field of science.’

In order to make the content stick and to place it in a broader context, Schleipen deems it essential to give students both historical baggage and deeper background information. ‘We can’t teach all the formulas and mathematical background to non-optical engineers. But it is useful if they know where these calculations come from. If they know that Ampère, Coulomb and Faraday made discoveries in the eighteenth and nineteenth centuries in the field of electricity and magnetism, and that afterwards a genius physicist, James Maxwell, came along who was able to describe electromagnetic forces mathematically. And that when this physicist was juggling with his formulas, all the pieces of the puzzle fell into place and a new physical constant dropped out, closely resembling the speed of light as measured at that time. He felt that there must be a connection. Maxwell’s equations still form the basis of modern optics.’

‘At the end of the nineteenth century,’ continues Schleipen, ‘Hertz discovered the photoelectric effect, followed by the rise of quantum mechanics at the beginning of the twentieth century. New insights showed that particles, such as electrons, could be described both as matter and as a wave. Conversely, light could behave both as a wave and as a particle. Students tell me that they appreciate this kind of knowledge.’

Schleipen also wants to give background information when explaining optical-physical phenomena. ‘If you use a lens to focus a laser beam, the spot has a finite width. But why? You can then indicate that it is due to diffraction and/or refraction of light. But I also want students to understand the cause of this phenomenon: that it stems from the wave character of light. I am firmly convinced that this helps to create understanding.’

'Small demonstrations can be very illuminating.'

In the new training course, developed in collaboration with TNO, Schleipen has made more room for demos. His experience is that small demonstrations can be very illuminating. ‘In a simple practical demo I let students determine the distance between the tracks of a CD, DVD or Blu-ray disc. We shine a laser on a DVD and the students measure the angles of the diffracted beams, just with a tape measure. And gosh: really, to an accuracy of less than a tenth of a micrometer!’ he laughs. ‘Such a test lasts ten minutes, everyone has woken up and we can move on to the next module.’
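For readers who want to repeat the demo, the relation behind it is the standard grating equation; the numbers below are illustrative assumptions, not values from the course:

d\,\sin\theta_m = m\,\lambda        % d: track pitch, \theta_m: angle of the m-th diffraction order, \lambda: wavelength
% Example with assumed values: a red laser (\lambda \approx 650 nm) on a DVD gives a first-order beam
% at \theta_1 \approx 61^\circ, so d = \lambda / \sin\theta_1 \approx 650\,\text{nm} / 0.88 \approx 0.74\,\mu\text{m},
% which is indeed the nominal DVD track pitch.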


Interesting natural phenomena are also discussed during the course. ‘Why do we see a rainbow, why are there sometimes two and why are the colours of these two arcs inverted?’

A stable discipline

In the Netherlands, optical technology can look forward to renewed attention. For example, the High Tech Systems & Materials top sector published the Photonics National Agenda last July, and in 2018 substantial subsidies for photonic chips were allocated. However, Schleipen reacts quite neutrally to the question of whether we are dealing with a renaissance of his field. ‘We certainly play along, but in the field of optics we are a small country, for example when compared to Germany. Naturally we have ASML and Signify and a few dozen small and medium-sized companies that do very well in the field of photonics, but certainly not hundreds, as is the case with our Eastern neighbours.’

With this statement, Schleipen doesn’t mean to play down optics. ‘It’s primarily a very stable field because it provides a basis for an extremely wide range of subjects with many areas of application. You can find optical technology in metrology, sensors, inspection, safety, data communication, imaging, in the automotive industry and in the biomedical and life sciences.’

The new course also responds to recent developments in the field of optical phenomena and instrumentation. ‘To give an example: Imec in Leuven developed a new cmos image sensor a few years ago, which had a large range of tiny spectral filters integrated on it. These compact and potentially inexpensive hyperspectral sensors have now found their way into a whole range of new applications in healthcare.’

'Students now experience the material themselves by being able to physically turn the knobs during the weekly practical sessions.'

In fact, the optical field is so broad that not all sub-areas can be covered in sixteen three-hour modules. ‘We could easily add four more modules, but we have to stop somewhere. We don’t deal with life sciences and biomedical technology, but the basic principles are addressed adequately. And above all: students now experience the material themselves by being able to physically turn the knobs during the weekly practical sessions. After the course, participants are sufficiently equipped for all teams where the discussions about optics go into depth. And at home, they can explain in minute detail where the colours of a rainbow come from.’

This article was written by René Raaijmakers, tech editor of Bits&Chips.

Recommendation by former participants

At the end of the training, participants are asked to fill out an evaluation form. To the question 'Would you recommend this training to others?', they responded with an average score of 8.9 out of 10.

Huub Janssen on taking the lead in Design Principles for Precision Engineering

trainer precision engineering
Huub Janssen from Janssen Precision Engineering is one of the former figureheads of the Design Principles for Precision Engineering training. His ambition was to spread know-how in the vein of Wim van der Hoek.

A longstanding wish of Huub Janssen of Janssen Precision Engineering has been fulfilled: he now shares his knowledge in the same way that his mentor Wim van der Hoek did.

Janssen deems Van der Hoek ‘awe-inspiring.’ In the early 1980s, he was looking for a niche in which to spend his last university years at the Eindhoven University of Technology and came across a professor who worked mainly in precision mechanics. ‘Wim invited me to his monthly mornings. There he would put a large sheet of paper on the table and scribble down all kinds of problems. We would discuss them with a handful of students who each had their own graduation assignment, and for two to three hours we would talk about progress and technical problems.’


Huub Janssen is the new figurehead of the Design Principles for Precision Engineering training course.

The main focus was on the content, the technical approach, the concept and how it is put into practice. ‘Everyone freely offered solutions. One graduate would put down his problem and then five or six men would jump up to solve it in various ways. It was quite a game. That stimulation from Wim really appealed to me. I took to it like a fish to water; it goes without saying that I felt at home.’

'I have always enjoyed discussing technical problems with young people. I also do that when coaching my employees.'

In the eighties, Janssen worked at ASML, made production equipment for LCDs at Philips in Heerlen, and then started an engineering firm dedicated to precision instrumentation. Education has always attracted him, but in recent decades entrepreneurship took priority. ‘Just like Van der Hoek, I have always enjoyed discussing technical problems with young people. I also do that when coaching my employees,’ says Janssen.

Now that employees have taken over part of his duties, his thoughts have automatically turned to knowledge transfer. When approached by Jan van Eijk and Adrian Rankers of the Mechatronics Academy, partner of High Tech Institute, Janssen didn’t have to think twice.

Limburg’s flan

We are talking in the very space that Huub Janssen named after his great inspiration, Wim van der Hoek. Over Limburg’s flan and coffee, the precision engineering entrepreneur raises a subject that engineers often bring up in conversation: the passion he already had for technology in his youth.


Janssen Precision Engineering’s new meeting room, completely surrounded by glass. Huub Janssen named the space after his mentor.

During his high school years, Janssen photographed birds. His challenge was to capture them in flight. He didn’t want to sit behind the camera all day long, so he came up with a solution. In a nesting box, he set up a Praktica – the SLR camera that more or less fitted within his budget – and put together a shutter mechanism with a light beam and a photodetector. ‘Everything was arranged so that the Praktica’s shutter fired at the precise moment that the bird flew through the beam. An electric solenoid triggered the self-timer. Not with a normal motor, because it had to be bam! Done.’

He got his entrepreneurial spirit from home. His parents had a fruit company and his father often built machines himself, such as a machine to sort apples. During his last years at university, Huub devised a measuring scale that made it easier to fill fruit trays to a specific weight. Not ordinary scales, because with those you would need to calculate back and forth, and Janssen wanted to avoid that. ‘You could buy those kinds of scales for three thousand guilders, but that was a lot of money back then. I wanted something that would enable you to see in one go whether you had to add or take away a few apples. I was always thinking about things like that.’

He solved it with leaf springs, electronics and an optical sensor. ‘There were all kinds of Van der Hoek design principles in it,’ he laughs, referring to the professor whose Monday morning sessions he sat in on at the time.

During his final years at university, Janssen developed an instrument that could map out wear and tear in fillings and molars. ‘Interferometry and optics were part of the solution. I had to position in six degrees of freedom within fractions of a micrometer, and I could really let loose with new ideas. Moreover, I also had a real customer, so it had to work eventually.’


Huub Janssen with the piezo knob, a component on which he has a patent. With this revolutionary concept, based on piezo elements and a rotating mass, steps of 5 nanometres can be made.

After graduating in the eighties, Janssen worked at ASML on the first PAS2500 wafer stepper. ‘I had learned a lot from Van der Hoek, but at ASML I was able to see where things can go wrong. With Van der Hoek you learn to design something statically determinate. For example, you get stability with three support points. But not everyone is happy with a three-legged table. At ASML I learnt to understand when to apply specific design principles and when not to.’

'I learnt that you cannot always apply Van der Hoek’s design principles in any situation. You have to know when you can and when you can’t.'

For the PAS2500, they had initially developed a new interferometer to measure the position of the stage in the x and y directions. ‘We did this completely in accordance with the Van der Hoek design principles, with elastic elements and so on. There was no hysteresis, but everything kept vibrating. There, I learnt that you cannot always apply Van der Hoek’s design principles in any situation. You have to know when you can and when you can’t,’ explains Janssen.

After ASML, he joined Philips in Heerlen, where he developed production equipment for LCDs. A few years later he started his own engineering office. ‘During my final university years, I also worked for a real customer with a real technical problem, including the demand for hardware. That was just my thing.’

In 2010, Huub Janssen received the Rien Koster prize in recognition of the high level at which he practises precision technology in his company Janssen Precision Engineering (JPE). In addition to the large amount of advanced work done for clients, the jury also emphasised Janssen’s attention to the coaching and training of his employees. JPE has since recorded thirty patents for its inventions.

Within JPE, Janssen started collecting and documenting technical principles and solutions more than ten years ago. Initially for his employees, but also for the outside world. Whenever Janssen or his colleagues delve into something or have to come up with a technical solution, they record it. ‘We always have to figure something out or look it up again. How did that technical calculation go again? I thought: let’s do it properly once, and then the next time employees need it, they will also benefit from it. I started documenting the cases on one A4 sheet.’ Everything is divided into categories such as ‘engineering fundamentals,’ ‘construction fundamentals,’ ‘dynamics and control’ and ‘construction design & examples.’

You have to invest time in it, ‘but then you also have something,’ says Janssen. ‘The technical problem and all the formulas that matter have to fit on that sheet of A4. That means only the essential information. In the meantime, it totals about fifty sheets of A4.’ Janssen thought that the information also had marketing value and started to publish it. That is how Precision Point came about, a page on the Janssen Precision Engineering website where everything is accessible. ‘Even a professor at MIT mailed me to ask if he could use the knowledge in his lectures.’ Janssen also bundled the A4 cases into a handy booklet under Albert Einstein’s motto ‘never remember anything you can look up.’ He regularly receives orders from schools, competitors and customers.


Under Albert Einstein’s motto ‘never remember anything you can look up,’ Huub Janssen has documented precision cases. Each case fits on one sheet of A4. The knowledge is available at Precision Point on his website, and also available in print.

It is difficult to say whether these efforts also generate extra business. ‘We can, however, see that interested parties look at our core activities in high tech engineering and at our products after reading our Precision Point pages.’

He said yes to Van Eijk and Rankers’ request to become the figurehead of the Design principles for precision engineering training course because education has always attracted him. Much of the knowledge and experience in the design principles training course comes from the Wim van der Hoek ideology.

For former students and colleagues, Van der Hoek could do no wrong. When they praised him at a party in honour of his 80th birthday, the emeritus professor responded: ‘I am being praised to the skies in a shameful way.’

But after some thought, Janssen manages to dig up a criticism. ‘He liked to talk. He talked pretty quickly, so it was quite difficult for beginning university students, who still had to master the profession, to follow everything. You really had to pay attention, because a lot of information came flying at you in those few hours.’

'Van der Hoek quickly came up with his own ideas about the path that solutions should take.'

Van der Hoek liked to talk, rapidly pointing in which direction to go, and he also had something to say. ‘He quickly came up with his own ideas about the path that solutions should take, and that was often astonishing.’


‘Thirty years ago, positioning at a micrometre was something from another planet.’

What was so special about Van der Hoek’s approach?

‘It has to do with the field. Thirty years ago, positioning at a micrometre was something from another planet. It is a field where you cannot simply apply normal functional elements such as bearings and gears. Even at this moment, it is still unexplored territory for many parties. Worldwide. Until the fifth year at university, we only learnt what other prospective engineers were learning: gears, drive shafts, v-belts and so on. But if you are going to position at a micrometre or a fraction thereof, then you can’t simply use those components. Then you get completely different solution directions and things such as reproducibility and avoiding backlash become important.’

You want to shape the design principles training in the spirit of Van der Hoek. What do you mean?

‘We are talking about design principles for precision engineering. That is the world of complex machines and instruments for the chip industry, astronomy and space travel. To position more accurately than a micrometre, you cannot simply use standard functional elements such as bearings. Then you come to elastic elements, no friction and those sort of things. After that it becomes exciting, because you are very close to physics.’


Janssen: ‘I can still remember that Van der Hoek asked his students to crawl in thought into a ball bearing.’

Manufacturers must recognise that they cannot just buy standard parts from a catalogue. They have to think a bit further and analyse all the problems that may arise. Then you have to imagine things in your head, do ‘thought experiments’: where can things go wrong? If you can see that, the way to the solution is close. ‘I can still remember that Van der Hoek asked his students to crawl, in thought, into a ball bearing, to imagine the outer ring and inner ring with all the balls in between. We had to make ourselves so small that we were sitting between those spinning balls. Then you see that a ball on one side is pressed against the ring and on the other side has room to play. Next you see that a ball isn’t completely round, it has dents and doesn’t turn well. If it is a dent of a micrometre, then it means a micrometre of error. You don’t have to have much experience, but you do need a lot of imagination to be able to do thought experiments.’

What is specific about your contribution to the training?

‘The way solutions are reached is important. I don’t have a lot to do with formulas. Of course, they are needed, but calculating is the last ten percent of the job. Primarily, designers need to get a feel for the details. What should they pay attention to? How do they solve matters? You first need to know where things can go wrong and then come up with a good conceptual direction. I especially want to instil intuition. Calculation techniques will come after that.’

‘That’s why I want to introduce case studies. Van der Hoek did that in his Des Duivels Prentenboek, in which he published unsuccessful projects. Participants thus get to work alone and in groups. Then we have a large group discussion. I don’t want a lecture, I prefer interaction.’

This article was written by René Raaijmakers, tech editor of High-Tech Systems.

Recommendation by former participants

At the end of the training, participants are asked to fill out an evaluation form. To the question 'Would you recommend this training to others?', they responded with an average score of 9.5 out of 10. Besides Huub Janssen, trainers include Dannis Brouwer (University of Twente), Piet van Rens (Settels Savenije), Kees Verbaan (NTS), Chris Werner and Roger Hamelinck (Entechna Engineering).

On to battery-less IoT devices: Ultra-low power

trainer Herman Roebbers, Ultra-low power for the Internet of Things
Herman Roebbers is an advanced expert at Capgemini Engineering and has been working on embedded systems and parallel processing since the mid-1980s. He is also an external advisor to the EEMBC working groups Ulpmark, Iotmark and Securemark, and an ultra-low power trainer in the workshop ‘Ultra-low power for the Internet of Things’.

In the pursuit of battery-less IoT, it is important to use energy as efficiently as possible. By using an encryption library as an example, Herman Roebbers shows how small tweaks to the tooling and chip settings alone can have a huge impact on consumption.

How can I reduce the energy consumption of my IoT system towards ultra-low power? This question is becoming more and more relevant as we continue to increase our expectations of IoT devices. Ultimately, the goal is that systems require so little energy, they can harvest it from their environment and no longer need batteries.

To achieve this, we need to work in two directions: increasing harvest yields and reducing consumption. The first is being addressed: new materials and methods to make and post-process solar cells produce ever-higher yields. Progress is also being made in the field of RF energy harvesting. The Delft startup Nowi, for example, has made special chips that are very good at this. Furthermore, a lot of research is being done on new materials to convert temperature differences into energy more efficiently. We are also working hard on increasingly efficient converters that turn harvested energy into the required voltage(s) and ensure efficient energy storage, for example in rechargeable batteries or supercapacitors.

A case study for ultra-low power

An earlier Bits&Chips article gave an overview of all aspects that are important to save energy: from chip substrate, transistor selection, processor architecture and the circuit board to driver, OS, coding tools and coding styles up to the application. In the meantime, the table has been expanded somewhat.

A recent case illustrates the effect of different mechanisms on energy consumption. EEMBC just released a benchmark to determine the energy needed for several typical tls (transport layer security) operations. Tls is part of an https implementation and, as such, is essential for setting up a secure connection. The benchmark has been ported to an evaluation board that supports cryptography through the Arm Mbedtls library.

We can use that process to show what each optimization step delivers. For this purpose, we first perform a baseline measurement each time. The benchmarking framework uses an energy monitor from Stmicroelectronics and an Arduino Uno. The Arduino is used as a uart interface towards the device under test (dut, Figure 1).


Figure 1: The setup for measuring power consumption

We also use the development environment Atollic Truestudio 9.0.1 for STM32, which uses a proprietary version of the GCC compiler, as well as the Stm32cubemx software, which can generate (initialization) code for peripherals and thus considerably simplifies configuration.

Step 1: Look at the compiler settings

If we do a baseline measurement with a non-optimized version (setting -O0) at 80 MHz (the highest speed) and 3.0 volts, this results in a Securemark score of 505. Changing the optimizer setting to -O1 makes a huge difference: we jump to 1336. The optimizer settings -Og and -O2 don’t make much difference, but if we go to -O3 or -Ofast, things get even better: 1490.

This demonstrates what you can achieve with the compiler settings alone. The ideal settings, however, can differ per function. In our case, for example, there is no difference between -O3 and -Ofast, but this is not always the case. So, it may pay off to choose the settings per function or per file separately.

With the compiler settings -O2, -O3 and -Ofast, programming errors may surface that do not show up with other settings. Timing can change, and it is necessary to qualify variables that are used in multiple contexts (e.g. normal and interrupt context) as volatile to avoid problems.
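A minimal sketch of that volatile issue (the names below are hypothetical, not taken from the benchmark code): a flag shared between an interrupt handler and the main loop must be declared volatile, otherwise the optimizer may keep it in a register and the loop never sees the update.

#include <stdbool.h>

/* Shared between main loop and interrupt context, hence 'volatile'. */
static volatile bool rx_complete = false;

void uart_rx_isr(void)               /* hypothetical interrupt handler */
{
    rx_complete = true;
}

void wait_for_rx(void)
{
    while (!rx_complete) {
        __asm volatile ("wfi");      /* sleep until the next interrupt */
    }
    rx_complete = false;
}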

Step 2: Look at the pll

Microcontrollers nowadays have very extensive settings for all kinds of clock signals on the chip. One of those settings concerns the frequency multiplier (pll). This can be used to multiply and divide a low frequency to create all kinds of other clock speeds.

In our case, the frequency of the internal oscillator is 16 MHz. To make 80 MHz out of that, we cannot simply multiply it by five, unfortunately. We have a choice of two settings: the first is to divide by 1, multiply by 10 and then divide by 2. The second option is to divide by 2, multiply by 20 and then divide by 2 again.

That gives different scores: 1462 against 1490. The result in both cases is 80 MHz, but the second method is two percent more economical. The lower the clock frequencies, the less energy you lose, and the sooner you divide the clock frequency, the better.
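As an illustration, this is roughly what the second, more economical PLL setting looks like in STM32Cube-style code (a sketch assuming the STM32L4 HAL used on the evaluation board; the exact field values must be checked against your own clock tree):

#include "stm32l4xx_hal.h"

/* 16 MHz HSI -> /2 -> x20 -> /2 = 80 MHz system clock.
 * Dividing early (PLLM = 2) keeps the VCO input frequency low; in the
 * measurements above this was about two percent more economical than
 * the /1, x10, /2 alternative. */
static void configure_pll_80mhz(void)
{
    RCC_OscInitTypeDef osc = {0};

    osc.OscillatorType      = RCC_OSCILLATORTYPE_HSI;
    osc.HSIState            = RCC_HSI_ON;
    osc.HSICalibrationValue = RCC_HSICALIBRATION_DEFAULT;
    osc.PLL.PLLState        = RCC_PLL_ON;
    osc.PLL.PLLSource       = RCC_PLLSOURCE_HSI;
    osc.PLL.PLLM            = 2;               /* divide by 2 first */
    osc.PLL.PLLN            = 20;              /* multiply by 20    */
    osc.PLL.PLLR            = RCC_PLLR_DIV2;   /* divide by 2 again */
    HAL_RCC_OscConfig(&osc);
}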

If you have enough time at your disposal, you can also use the processor without pll, because that’s actually quite an energy guzzler. With the built-in oscillator, we can generate a maximum frequency of 48 MHz, which results in a 4 percent higher score. The disadvantage is that it takes a bit longer: 80/48 = 1.66 times longer to be precise.


The Nucleo L4A6ZG development board from STMicroelectronics offers a lot of tools to optimize energy use.

Step 3: Turn off unnecessary clocks

Now that we have explored a few things, we can choose a setting and further optimize from there. We start quite conservatively: -O1 and a frequency of 80 MHz via our second pll setting. This brings our Securemark score to 1336.

The next step is to turn off all superfluous clocks. In our case, the clock to the uart and all i/o ports can be turned off. This saves between 2.5 and 2.9 mW and raises the score to 1448 – a factor of 1448/1336 = 1.08, or an 8 percent gain.
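
In HAL-based code this amounts to a handful of clock-gating macros, roughly as in the sketch below (again assuming the STM32L4 HAL; which clocks can actually be gated depends on what the application uses).

#include "stm32l4xx_hal.h"                  /* assumes the STM32L4 HAL */

static void disable_unused_clocks(void)
{
    __HAL_RCC_USART2_CLK_DISABLE();         /* uart only needed before/after the run */
    __HAL_RCC_GPIOA_CLK_DISABLE();          /* unused i/o ports */
    __HAL_RCC_GPIOB_CLK_DISABLE();
    __HAL_RCC_GPIOC_CLK_DISABLE();
}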

Step 4: Optimize the memcpy function

During the execution of cryptographic functions, the memcpy function is used frequently. Opting for an optimized version yields a 5 percent gain in the case of GCC; the IAR compiler already ships an optimized version. This lifts our score to 1524.
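
The idea behind such an optimized memcpy is simple: copy in 32-bit words instead of bytes whenever alignment allows. The sketch below only illustrates the principle; production versions additionally unroll loops and handle unaligned heads and tails.

#include <stddef.h>
#include <stdint.h>

void *memcpy_word(void *dst, const void *src, size_t n)
{
    uint8_t *d = (uint8_t *)dst;
    const uint8_t *s = (const uint8_t *)src;
    if (((uintptr_t)d % 4 == 0) && ((uintptr_t)s % 4 == 0)) {
        while (n >= 4) {                    /* bulk of the copy in 32-bit words */
            *(uint32_t *)d = *(const uint32_t *)s;
            d += 4; s += 4; n -= 4;
        }
    }
    while (n--) {                           /* remaining (or unaligned) bytes */
        *d++ = *s++;
    }
    return dst;
}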

Step 5: Tighten the thumbscrews

Now we can see whether we can also manage with a lower clock frequency, which would allow us to reduce the core voltage. For our mcu, the core voltage must be 1.2 V for frequencies above 26 MHz. For simplicity, we take 24 MHz, a standard frequency in the menu of the msi oscillator, so the pll can remain off – another 4 percent gain: 1588.

We can also test whether we can safely set the compiler optimizations a bit more aggressively. If we go to setting -O2, we arrive at a score of 1691 – another 6.4 percent gain.

Step 6: Reduce voltages

We have already prepared the clock frequency to allow a lower core voltage. Now we actually set it. The result is beautiful: 2021, almost 20 percent gain!
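
On an STM32L4 this boils down to a single HAL call, sketched below; the exact scaling ranges and frequency limits are device specific, so check the reference manual.

#include "stm32l4xx_hal.h"                  /* assumes the STM32L4 HAL */

static void select_low_core_voltage(void)
{
    __HAL_RCC_PWR_CLK_ENABLE();             /* PWR clock must be on to change scaling */
    /* Range 2 is the low-power setting, only allowed at reduced clock frequencies. */
    if (HAL_PWREx_ControlVoltageScaling(PWR_REGULATOR_VOLTAGE_SCALE2) != HAL_OK) {
        /* scaling rejected, e.g. the clock is still too high */
    }
}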

The power supply voltage can also be a bit lower. We started at 3.0 V, but if we go to 2.4 V, that again gives an improvement of 26 percent. We can go even further to 1.8 V if necessary. We haven’t done that here, but if we extrapolate, we can expect a further saving of a third.

Conclusion

With a few simple measures, energy consumption can already be drastically reduced towards ultra-low power. In our case study, a factor of five to seven is easily achievable compared to a non-optimized version.

'An embedded system without batteries is within reach.'

However, I have limited myself here to tooling and chip settings. With additional measures in other areas, tens of percent extra improvement can be achieved. Therefore, an embedded system without batteries is within reach.

This article is published by Bits&Chips.

Recommendation by former participants

At the end of the training, participants are asked to fill out an evaluation form. To the question 'Would you recommend this training to others?' they responded with a 7.6 out of 10.

Design Principles (course) still firmly anchored in Wim van der Hoek’s ideology

Design Principles is one of the most renowned training courses given by the Mechatronics Academy and High Tech Institute. Our ‘mechatronics men’ Jan van Eijk and Adrian Rankers have renewed the training course and asked Huub Janssen to assume the role of course director. The course remains firmly anchored in the foundations laid down by the renowned Professor Wim van der Hoek. The biggest changes are the addition of new top specialists and new focus points. The training course is now known as Design Principles for Precision Engineering.

It is pretty risky to redesign one of the most renowned training courses in the Netherlands’ high tech world. Yet we had no other option. Piet van Rens, who was for a long time the face of the course, wanted to considerably limit his work as a structural engineering trainer. He has a lot of fun in the projects he does for ASML, but his agenda is just too full.


Piet van Rens was for many years the face of the Design Principles training course. He continues as a course trainer but is no longer a course director.

Thus, Van Eijk and Rankers had to go in search of successors. They took advantage of the situation to reformulate the training course itself. For approximately a year now, the training programme has been in the hands of a strong team of specialists from the Dutch precision world. In addition to Van Rens, a handful of top experts have been found to immerse the course participants in trusted fundamental knowledge and insights, as well as in relevant additions to the engineering field. The new faces include Huub Janssen from Maastricht’s Janssen Precision Engineering, Chris Werner and Roger Hamelinck from Entechna Engineering in Eindhoven, Dannis Brouwer, professor of precision engineering at the University of Twente, and Kees Verbaan from NTS. The debut of the renewed course in June 2018 was a success. Thereafter, the new training programme was fully booked and awarded an average score of 8.4.

The knowledge and experience in the Design Principles training course come from the ideology of Wim van der Hoek, the renowned professor of precision technology to whom Dutch high tech owes a lot of its design principles and knowledge. Van der Hoek devised a number of essential design principles in the sixties and seventies, such as the famous hole hinges, with which machine builders could achieve nanometre precision.

'It became an honour for someone to have their design and improvements in the Des Duivels prentenboek.'

In addition, Van der Hoek gained great fame by collecting unsuccessful designs and including them in Chapter 13 of his infamous Des Duivels prentenboek. He stated that you learn best by making mistakes; the easiest and cheapest training is getting to know those mistakes. ‘This reference work became so well known that it became an honour for someone to have their design and improvements added to it,’ says Piet van Rens.

Van der Hoek’s successors, Professors Rien Koster and Herman Soemers, are enriching that basis. ‘The new style of design principle training elaborates on the legacy which we have had in the Netherlands for decades, namely to design properly using the correct design principles,’ says course leader Adrian Rankers of Mechatronics Academy, the partner in charge of the mechatronics training at the High Tech Institute.

Huub Janssen is the new figurehead of the Design Principles training course. Like Piet van Rens, he comes from the school of thought of Professor Wim van der Hoek. The precision engineer honoured his mentor by naming the new meeting and demo room at Janssen Precision Engineering after the person who had inspired him, Wim van der Hoek.

The updated training course includes countless new elements. For example, there is more attention given to damping and to advanced elastic elements which have a somewhat larger stroke. Elastic elements are often limited in their range of motion, but there are concepts available which achieve larger strokes. This is one of the research topics of Professor Dannis Brouwer from the University of Twente, who imparts a day of training on flexure mechanisms.

Brouwer also handles energy compensation and gravity compensation techniques (think of the kitchen cabinets that you can open and close vertically and which stay in position while they move up and down easily). ‘That includes balancing mass-like issues,’ says Adrian Rankers. ‘As in a complex robot system where you try to get rid of the reaction forces on the floor by having another body simultaneously make the right movements that precisely compensate the forces. That can be complicated, so we have called it energy compensation. But you can also call it energy balancing.’

Rankers emphasizes that the ‘mechatronic context’ recurs throughout the training. ‘On the one hand it provides additional requirements for mechanics, on the other hand it also offers alternative solution space. If previously you needed to create a positioning system, you did that with a cam drive and a drive chain up to the element that you had to position properly. In that chain you used to encounter all sorts of friction and slack – all of which is very annoying. But in a mechatronic movement system you have sensors on your payload. They tell you exactly what the position or position error is. In principle, you won’t be worried if there is a bit of friction or play in-between, because you have the information and you can immediately compensate for it. These kinds of trends make the subject choice for the Design Principles training shift, although it is still true that you can never get a high-quality system solution with rattling mechanics,’ emphasises Rankers. ‘In the less important topics we have made some room for new subjects.’


‘We have a long history here of designing properly using correct design principles,’ says Adrian Rankers, director of Mechatronics Academy. ‘Wim van der Hoek started that off, Rien Koster and Herman Soemers are continuing it.’

The course director Huub Janssen has set himself the goal of shaping the design principles course in the spirit of Van der Hoek. ‘We are talking about design principles for precision engineering. That is the world of complex machines and instruments for the chip industry, astronomy and space travel. To position more accurately than a micrometre, you cannot simply use standard functional elements such as bearings. Then you come to elastic elements, avoiding friction and that sort of thing. After that it becomes exciting, because you are very close to physics.’

Janssen says that manufacturers must recognise that they cannot buy standard parts from a catalogue. They have to think a bit further, analyse all the problems that may arise. Then you have to imagine things in your head, do ‘thought experiments’: where can things go wrong? If you can see that, the way to the solution is close.

During his part-time professorship, Van der Hoek asked his students to do thought experiments. Janssen: ‘I can still remember that Van der Hoek asked his students to crawl in thought into a ball bearing, to imagine the outer ring and inner ring with all the balls in between. We had to make ourselves so small that we were sitting between those spinning balls. Then you see that the ball on one side is against the ring and on the other side has room to play. Next you see that a ball isn’t completely round, it has indents and doesn’t turn well. You don’t have to have much experience, but you do need a lot of imagination.’

It is no coincidence that Janssen wants to enrich the training by injecting experience and exercises. ‘The way solutions are reached is important. I don’t have a lot to do with formulas. Of course they are needed, but calculating is the last ten percent of the job. Primarily, designers need to get a feel for the details. What should they pay attention to? How do they solve matters? You first need to know where things can go wrong and then come up with a good conceptual direction. I especially want to instil intuition. Calculation techniques will come after that.’

'I don’t want a lecture, I prefer interaction.'

He mainly uses case studies, just as Van der Hoek did in his Des Duivels prentenboek. ‘Participants thus get to work alone and in groups. Then we have a large group discussion. I don’t want a lecture, I prefer interaction.’

Exercises, interaction and working with practical cases are distinguishing factors in the structural training course that the Mechatronics Academy and the High Tech Institute bring to the market. In other organisations, the training is also available as a three-day variant.

Piet van Rens also has experience as a trainer for this three-day variant. He wants to emphasise that participants in the short version really miss something. ‘Some customers require employees sent by temporary work agencies to complete a design principles training course. This means that some engineering firms then choose on a cost basis, or opt for an evening version.’

Van Rens thinks this is not such a sensible choice. Practical exercises are the most valuable component of the Design Principles training course. They ensure that the contents really sink in and that participants actually understand them and apply them to their work. This hands-on element is precisely what is killed in the shortened version. ‘The three-day and evening training courses are not bad, but they skimp on content, cutting corners to fit into less time. This effect is felt even more when learning after a normal working day: people are tired in the evenings. If you only give lectures, then the outcome is really less effective,’ states Van Rens.

This article is written by René Raaijmakers, tech editor of High-Tech Systems.

Recommendation by former participants

At the end of the training, participants are asked to fill out an evaluation form. To the question 'Would you recommend this training to others?' they responded with a 9.5 out of 10.

ITRI sees crucial role for system architecting to achieve industrial transformation

System architect(ing) training - ITRI Testimonial
Two years in a row, the Industrial Technology Research Institute (ITRI) from Taiwan invited High Tech Institute to help introduce system architecting thinking in its organization. We asked executive vice president Pei-Zen Chang to tell us about ITRI’s ambitions and the role of system architecture in keeping the Taiwanese industry competitive in this age of fierce international competition. This is where the system architect(ing) training comes in.

It was July 2017 when Ger Schoeber arrived at Taipei international airport to lecture his first system architecting training in Taiwan. The next day he arrived at his final destination, the renowned Industrial Technology Research Institute (ITRI) at Chutung in the Hsinchu region. There the Dutchman faced a firing squad. Figuratively, that is.

It all began in a very friendly way. ITRI’s former executive vice president Charles Liu kicked off a week of training by introducing Schoeber to sixteen participants, all senior executives and cross-domain project leaders of the Taiwanese institute. Liu told Schoeber with a smile that his colleagues had all prepared well. They had read the material and were actually not so impressed. Some had even asked Liu why they had to clear their agendas for five whole days for this stuff. “I wish you good luck this week,” Liu pronounced firmly.


Pei-Zen Chang, executive vice president of ITRI: “System architecture knowledge will contribute to value creation across Taiwan’s industry.” Photo: ITRI

To Schoeber, Liu’s message was clear. He had to prove that his System Architecting (Sysarch) training was worth the investment for ITRI. Schoeber faced five days of lecturing a group at the level of vice president and higher. At that moment he had to swallow hard, Schoeber admits two years later.

ITRI and High Tech Institute got acquainted in 2016. Dr. Jonq-Min Liu, at that time president of ITRI, wanted to strengthen the institute’s systems thinking knowledge to overcome cross-domain problems. Liu directed ITRI College to do an assessment, which subsequently evolved into a recommendation to seek cooperation with the High Tech Institute of the Netherlands.

ITRI College identified a system architecting training at High Tech Institute that originated from abundant experience in complex systems development at Philips and ASML. The Dutch institute was spun off from Philips in 2010. As the successor of the Philips Centre for Technical Training, it is active in post-academic education for technicians in the open market.

ITRI wanted to introduce the System Architecting course with the goal of training leaders of cross-disciplinary projects. It should help them handle cross-domain planning and management, and communicate about and resolve system problems. Edwin Liu, the president of ITRI, firmly believes in a systems approach for his organization: “In addition to continuously deepening scientific and technological innovation and R&D, ITRI must carry out cross-unit and cross-disciplinary cooperation to bring about industrial transformation.”


Ger Schoeber teaching system architecture at ITRI in July 2018. Notice the abundance of paper on the wall, resulting from discussions and learning exercises. By the end of the week, the whole classroom is usually covered with paper.

ITRI’s role in Taiwan

Industrial transformation, that’s what ITRI is all about. The institute is a nonprofit R&D organization engaging in applied research and technical services. Since its foundation in 1973, ITRI has grown into one of the world’s leading technology R&D institutions. It has played a vital role in transforming Taiwan’s economy from a labor-intensive industry to a high-tech industry. “ITRI’s mission is to assist Taiwan’s industrial development,” says Pei-Zen Chang, the executive vice president who is responsible for introducing system thinking at ITRI. “It has been mandated not only to provide assistance in technological development, but also to assist in industrial transformation and development.”


The 2018 System Architecture class with Pei-Zen Chang, the executive vice president of ITRI (sitting 2nd from left) and trainer Ger Schoeber (sitting in the middle).

Taiwan and The Netherlands are similar in size and population. Both countries know: if you are small, you have to be smart. Just like the Dutch, the Taiwanese people have relied on their determination and perseverance in search of optimal economic development models to compete with their larger and stronger neighbors. In this continuous race, the drive to excel in technology has always been a major force for Taiwan, and ITRI is playing a crucial role there. Even an imperative role, says Chang: “In the age of fierce international competition, Taiwan’s industrial structure remained predominantly small. We have a lot of medium-sized enterprises that rely on our innovations.”

The Taiwanese institute has been quite successful since its foundation. It helped incubate over 270 companies, including famous examples like UMC and TSMC. Its 6100 employees – over 80 percent of whom hold advanced degrees – produce over a thousand patents annually (an accumulated total of over 27,000). Chang’s message is that ITRI has to continuously help the Taiwanese industry transform and upgrade – a role comparable to that of TNO in The Netherlands and Fraunhofer in Germany.

One example is ITRI’s involvement in the fiercely competitive machine tool industry. The R&D institute developed the controllers that helped Taiwanese manufacturers upgrade their products, become world-class and rival their German and Japanese competitors. Taiwan is a top-5 player in machine tools, on par with China. This market continues to be challenging, says Chang. “With the support of our Ministry of Economic Affairs, we have established a smart manufacturing demo line for product equipment performance verification and system commissioning in the field. This will keep us in step with Industry 4.0, and such facilities are expected to gradually strengthen the entire system’s capabilities.”

 

Smart logistics

Logistics is another example where systems thinking helped ITRI work with industry on innovative solutions. The institute helped introduce RFID, automation systems, smart pick-up stations and many more logistics technologies in Taiwan. Chang: “In the logistics industry there are many ways to get things delivered quickly. System engineering analysis enabled us to better understand the needs of the industry. It was evident to us that the identification and classification of various and voluminous items are key factors. Along the way ITRI helped steer Taiwan’s logistics and e-commerce companies towards smart logistics and services.”

'Research provides greater value when the development of technologies, components and modules is based on the needs of the industry'

Chang points out that value creation is a prime focus for ITRI. “Research provides greater value when the development of technologies, components and modules is based on the needs of the industry.”

That’s where system architecture comes in. Over the last couple of years ITRI invited industry veterans and system innovation experts to Taiwan to give lectures on system architecting. The institute wanted to infuse stronger system architecture thinking into its managers of various cross-disciplinary projects.

The goal was to establish a common language for the project leaders in different fields. Although the technologies in ITRI’s focus fields ‘smart living’, ‘quality health’ and ‘sustainable environment’ can lie far apart, the Taiwanese were convinced that a shared language among the various labs and fields would strengthen innovative R&D cooperation.

Part of ITRI’s strategy to introduce system architecture thinking was High Tech Institute’s system architecting training, a five-day intensive course. Apart from theory, participants spend most of the training working on case studies with in-depth discussion and learning exercises. “Our goal is to gradually build up the system architecture thinking,” says Chang. As a common language the participating students learn to think and talk according to the so-called CAFCR model.

CAFCR is all about stepping into the customer’s shoes. It forces system developers to not only look at the technology. The letters C, A, F, C and R represent five viewpoints from which to look at a system architecture. Only two of them are about technology. Three are about the customer’s perspective, and that is where the greatest value of the CAFCR framework lies. Most important is the first C, which stands for ‘customer objectives’.

“It is all about the customer,” explains trainer Ger Schoeber. “What exactly is their business? How do they earn their money? What is the living environment of the customer or the colleague who is going to install my subsystem? If you really understand the customer, you will see much clearer what it is that they need in order to do better business. CAFCR forces you to look not only at the technology, but also at the specification and the rationale of the requirements. It allows you to come up with solutions that help customers even more.”

'We keep strengthening the interdisciplinary competence of our labs and nurturing the professional talents for system integration'

ITRI’s senior executives and various project leaders all attended the entire Sysarch course in 2017. A survey among participants showed that especially the CAFCR model did help project leaders to systematically promote and implement the projects they are responsible for. That convinced ITRI to continue with Sysarch in 2018. Chang: “To keep strengthening the interdisciplinary competence of our labs and nurturing the professional talents for system integration.”

Participants valued the extensive experience that Schoeber has in system architecting in industry and in system innovation, Pei-Zen Chang points out. “Ger talked about the role of the system architect and its importance in operating a company, and introduced system architecting with detailed explanations, procedures, key drivers and the CAFCR model. Ger has also deepened participants’ understanding of the course content through role-play exercises. Using a hypothetical situation of ‘proposing equipment solutions for bedridden senior people’ over the five-day course, he divided the class into four groups, and asked each to present their solutions to a company’s senior executive or angel investors.”


Case studies with in-depth discussion form a large part of the Sysarch training.

This proved an effective way to ensure the participants gained a thorough understanding of the course. “In addition, drawing from his rich practical experience, Ger provided guidance to each group, so that the participants could correctly use the content and methods learned in class,” says Chang. With this systematic and professional curriculum and guidance, the course scored 4.97 points out of 5 in July 2018. “A very high satisfaction rating,” smiles Chang.

Program directors that are designated to lead a cross-disciplinary project and have followed the Sysarch course will take the next step in system architecting at ITRI. Chang: “From now on they will effectively implement their newly acquired planning and maintenance skills, and share their experiences and knowledge with colleagues throughout the institute.”

Once this market- and customer-oriented mindset has been ingrained throughout ITRI, such system thinking and experience will be disseminated to Taiwan’s industrial sector. “In this way they will help accelerate its transformation and upgrading to create new value,” says Chang.

Close relationship

In the past fifty years, key industries from Taiwan and the Netherlands have forged close relationships. Both countries have been able to carve out unique industrial advantages and flourish internationally. Chang points to the Philips TV factories that were set up decades ago in Taiwan. “This helped upgrade our nation’s production knowhow and cultivate our talents,” he says.


During Sysarch 2018 Ger Schoeber discusses the case ‘equipment solutions for bedridden senior people’ with division director Keh-Ching Huang, who is one of the participants.

In the 1980s Philips bankrolled the creation of TSMC. As the world’s largest semiconductor foundry, TSMC is now one of ASML’s main customers. Recently ASML acquired Taiwanese Hermes Microvision, a specialist in metrology solutions for chip production.

Chang sees a bright future for the relationship between both countries. He points to the ‘5+2 Industry Innovation Plan’ that the Taiwan government has been promoting in recent years. “This plan encompasses smart machinery, Asian Silicon Valley, green energy technology, the biomedical industry, the national defense industry, new agriculture, and the circular economy,” elaborates Chang. “We reckon that international cooperation is one of the most important means to implement such programs, and there is no doubt the Netherlands will be an ideal partner for us to work with, given the country’s deep experience in system development and solid foundation with respect to semiconductors, agriculture, circular economy, green energy, precision machinery and so on.”

ITRI underlined this by opening a physical presence at the High Tech Campus in Eindhoven. This office actively promotes scientific cooperation with the Netherlands. Chang: “Since ITRI is expected to take on some of the responsibility for implementing Taiwan’s 5+2 Industrial Innovation Plan, I will spare no efforts to help strengthen the cooperation momentum with the Netherlands, in order to create for both countries high-value technology industries with blue ocean benefits.”

This article is written by René Raaijmakers, tech editor of Bits&Chips.

Recommendation by former participants

At the end of the training, participants are asked to fill out an evaluation form. To the question 'Would you recommend this training to others?' they responded with an 8.5 out of 10.

Do not confuse being able to hack with knowing the art of writing secure code

Secure coding in C and C++ training
You can turn writing good software security into a good habit. Something you barely stop to think about, like brushing your teeth or putting on your seat belt. High Tech Institute is working with the specialists at Hungary’s Cydrill to teach you how.
‘We teach developers how not to code.’ László Drajkó likes unsettling his conversational partners with this bold statement. Yet that’s what the software security courses taught by his company, Cydrill, revolve around: teaching coders the professional discipline to prevent weak spots in their software.

Perhaps ‘discipline’ is too strong a word. Drajkó thinks it’s more comparable to putting on your seat belt. ‘You no longer notice you’re putting it on. In the same way, you can teach yourself the good habit of writing good software security. And then you’ll automatically avoid pitfalls, without stopping to think about it. We teach people to instinctively use good coding habits.’ Secure coding doesn’t take more time, Drajkó says. ‘It takes time to learn how, but once you have there’s no difference.’

Teaching people to write secure code is a labour-intensive endeavour. Break-ins are continually happening around the world, exposing vulnerabilities. It takes a sizeable team to keep up with all that information and work it into training material as case studies. ‘An independent teacher would spend four hours staying up to date and incorporating new material for every hour of class,’ Drajkó estimates.

That’s why High Tech Institute is partnering with Cydrill, a specialist fully focused on training people to write secure code. The Hungarian company is especially focussed on security for embedded systems.


‘We aren’t selling painkillers and band-aids, but building an immune system that’s extremely resilient,’ say Ernõ Jeges (left) and László Drajkó (right), who visited the High Tech Campus in Eindhoven last summer.

The Commodore 64 and the ZX Spectrum

Cydrill is located in Hungary’s capital, Budapest. In the eighties young László Drajkó had access to computers, though within the Russian sphere of influence that access was very limited. His first acquaintance came when he was twelve. ‘Science was non-political. The educational system was highly theoretical, but quite good. Behind the Iron Curtain, we had to rely on our brains and we had few other resources.’

Drajkó and his fellow students wrote their code on paper. ‘We ran it in our heads and checked for coding mistakes before the code had ever run on a machine. We made our corrections on paper, too. Because when we finally had access to a machine, we wanted to feed it error-free programs. We barely had money or computers.’

In the mid-eighties the Hungarian coders were permitted to travel to Germany and Austria, where they were able to buy Commodore 64s and ZX Spectrums. ‘The generation before ours had to shell out millions of dollars for a computer, but suddenly we could buy a home computer for five hundred dollars. The PC had a major impact on our age group.’

In the mid-eighties Drajkó was studying computer science in Hungary. The Iron Curtain fell while he was still in college, which had a huge impact on him. A European Economic Community grant enabled him to attend the Delft University of Technology. The result was culture shock. His first few months in the Netherlands immersed him in ‘total miscommunication’.

Though he spoke English, Drajkó didn’t understand his advisors’ questions. ‘Not in terms of language, but conceptually. The educational approach was completely different. They’d ask things like, ‘László, what problem would you like to work on?’ And I’d say, ‘No, no, I don’t have any problems. Just tell me what code you want me to write and I’ll find the best algorithm for it.’ But then they said things like, ‘How would you like to change the world for the better?’ And I thought, ‘I’ve wound up in art school!’’


‘When I went to college in Delft, I thought I’d wound up in art school’ – László Drajkó on the culture shock he experienced as a Hungarian university student in the Netherlands.

Novell, Compaq and Microsoft

After twenty-five years working for international companies such as Novell, Compaq and Microsoft, Drajkó decided to invest in a training company. He wanted to share what he knew and was looking for a suitable niche. He found it in security. ‘I asked myself what was going wrong and one of the answers was cybersecurity.’

Some time ago Drajkó ran into two familiar faces, Zoltán Hornák and Ernõ Jeges. All three studied at the Budapest University of Technology, but Hornák and Jeges have known each other since 1990. That year, the Hungarian and the Serbian competed against each other in the second International Olympiad in Informatics in Minsk. A few years later, Jeges decided to study computer science in Budapest.

Hornák and Jeges became fast friends and during their doctoral research they conducted tests for Nokia, breaking into mobile telephones and networked systems. Demand was so high they abandoned their PhDs and started hacking systems on assignment. ‘White hat hacking was uncharted territory back then,’ Jeges says. ‘Very few companies were doing it. Nokia had a ton of assignments, and we realized we were learning more on the job than we were at the university.’

The penetration testing (pentesting) assignments poured in to their company, Search Lab: the pair were hired to break into networking hardware, set-top boxes and more. Most of the target systems were embedded. ‘Not many security companies focus on those, because you need to understand the system at the chip level. Most pentesting companies focus on websites and web services, but we explicitly specialize in embedded.’

The 2008 crisis hit Search Lab hard. In that same period, the mobile phone industry switched entirely to the iPhone and Android platforms. Hornák and Jeges lost most of their business from clients with whom they had a long history.

Their shared focus on security sparked the click with Drajkó. ‘The number of incidents is growing exponentially, while awareness is minimal,’ he explains. ‘Only a handful of companies are doing something about it. Everyone’s busy patching errors, but that doesn’t address the problem. Education is the golden opportunity to prevent a software security crisis. Our stance is that we aren’t selling painkillers and band-aids, but building an immune system that’s extremely resilient.’


Ernõ Jeges’s goal is not to teach people how to hack, but to instil paranoia.

Instil paranoia

In 2018 Drajkó and Jeges founded Cydrill, a company that focusses on training. The security industry is in constant motion and to keep up with it, Cydrill offers online training in addition to traditional classroom fare. For a modest annual fee, participants can shore up their knowledge using a digital gamification platform. The online approach also makes it easy to track results. ‘We measure our success by the way clients translate our expertise into coding habits,’ Drajkó says.

'If you ask developers to choose a course from nineteen different options, security will probably come in at the bottom.'

The need for inherently secure code is high, but not all developers are enthusiastic about security classes. ‘If you ask developers to choose a course from nineteen different options, security will probably come in at the bottom. It sounds very prescriptive. A new platform, new language or new architecture is much more appealing to them.’

Cydrill’s software security courses don’t teach developers how to hack. There are plenty of classes that do that, Drajkó says. Many of his clients in the US have experience with them. ‘But they’ve been turned off by them, because the course designers couldn’t relate it to their daily work.’

Drajkó believes that learning hacking techniques in order to prevent hacks is a waste of time. ‘It doesn’t matter whether it’s ethical hacking or hacking with bad intent. Because in terms of technology there’s no difference; it’s a question of morality.’

Drajkó believes that developers do need to be well versed in what exactly hacking is. ‘That’s why we address it. Participants also need to understand that hackers have infinite time and infinite resources. They make use of bots and third-party computers. In the embedded domain that use is growing in lockstep with the Internet of Things.’

That’s why Cydrill’s courses always start with a peek inside the hacker’s mind. ‘For example, we show them that a buffer overflow can be a problem,’ Jeges says. ‘That someone can take control that way, and it will no longer be your program that’s running.’
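
The classic illustration of that point, not taken from the course material, is an unchecked copy into a fixed-size stack buffer: a long enough input overwrites adjacent memory, including the saved return address.

#include <stdio.h>
#include <string.h>

void handle_request(const char *input)
{
    char name[16];
    strcpy(name, input);                        /* no bounds check: overflows for long input */
    printf("Hello, %s\n", name);
}

/* The disciplined habit is simply to respect the buffer size. */
void handle_request_safe(const char *input)
{
    char name[16];
    snprintf(name, sizeof(name), "%s", input);  /* always truncates and terminates */
    printf("Hello, %s\n", name);
}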

Jeges’s goal is not to teach people how to hack, but to instil paranoia. ‘The first day, participants go home feeling uneasy. They realize they’ve made mistakes in the past. That feeling is important. It has an impact we can’t achieve through online training.’ After that experience participants are all ears, Jeges notes with a smile. ‘Emotionally and intellectually.’

'They can apply the new techniques and skills they learn the next day.'

That makes the class ripe for covering best practices. ‘We show them the difference between well-meant attempts to make code hack-proof and actual best practices,’ Jeges says. ‘They can apply the new techniques and skills they learn the next day.’

Case studies are an important component of those best practices. ‘We use every incident that’s been global news,’ Jeges explains.

This article is written by René Raaijmakers, tech editor of Bits&Chips.

Recommendation by former participants

At the end of the training, participants are asked to fill out an evaluation form. To the question 'Would you recommend this training to others?' they responded with a 9.3 out of 10.

“If it works, managers think it’s pretty much done.”

Software development often focuses entirely on filling in functionality; there is no time for issues such as maintainability, architecture and performance. It doesn’t need to be difficult, says software trainer Onno van Roosmalen, but there are misunderstandings about Agile, architecture, UML, object-oriented programming and test-driven development.

Modeling? We don’t do that. Design patterns? Let’s skip those. Because we work agile. Onno van Roosmalen hears it regularly: Agile as an excuse, or even a pretext, not to take software architecture seriously. As a trainer in the field of software development, he sees that there are many misunderstandings around issues related to non-functional requirements: architecture, interfaces, performance, modeling, maintainability, you name it.

Misunderstandings that are easily explained, because they have to do with the elusive concept of ‘software quality’. “Quality is not visible to many people,” says Van Roosmalen. “If it works, then it works, and managers, customers and users think that it is pretty much done. It then becomes very difficult to argue why you have to do something extra. In practice, however, developers get a clear return on that.”

Many of these misunderstandings about software quality also live among developers themselves. Popular software techniques like test-driven development don’t help here, he thinks: “Test-driven development focuses completely on functionality. Aspects that are typically linked to architecture, such as performance, reusability, extensibility and software evolution, are very difficult to test. Just like race conditions and deadlocks.”

“There is also the idea that architecture is something abstract that has to be thought up in advance, which then forces the direction of the project into a straitjacket – a big bang architecture that has to be right the first time. But you can’t do that; often you don’t know what’s coming next,” explains Van Roosmalen. “Of course, it’s good to have an idea of how you want to arrange it. But you sometimes see that projects are already preparing themselves for certain additions. Then I often think: Yes, and will they truly come?”

'You really can make remarkably better software if you apply the guidelines properly.'

In addition, the underlying theory gathers dust over time. He sees this clearly, for example, in his basic training on object-oriented analysis and design (OOAD). He comes across participants with an electrical engineering background, for example, but also people who earned an IT degree. “Yet, you see that many of them dissect problems procedurally instead of in an object-oriented way,” says Van Roosmalen. “That’s what creeps in when people make software under pressure, while, really, you can make remarkably better software if you apply the guidelines properly.”

‘It’s not like that in C#’

Van Roosmalen himself has an entirely different background: he studied physics in Nijmegen and obtained his PhD at the Kernfysische Versneller Instituut in Groningen. He then made the jump to America for a postdoc position at the California Institute of Technology.

The turning point came in 1987. Van Roosmalen and his wife actually wanted to return to the Netherlands, but jobs in physics were scarce. When a position in technical computer science became available at Eindhoven University of Technology, Van Roosmalen decided to take advantage of this opportunity; his work in the field of computational physics had aroused his interest in software and computers.

Moreover, Van Roosmalen noticed that he enjoyed teaching. Before his return to the Netherlands, he had already been teaching at Yale for three years, and when object-oriented techniques emerged in the early nineties, he started training for companies. After the turn of the millennium, he decided to reduce his employment at the TUE to, ultimately, fully focus on training. He still works for the Eindhoven university, which hires him for the PDEng course in Software Technology.

'The longer developers have spent behind the keyboard, the more receptive they generally become to advanced software engineering concepts.'

Van Roosmalen also provides training for developers who have been employed by a company for a couple of years. That makes a big difference compared to starters, he notices: the longer developers have spent behind the keyboard, the more receptive they generally become to advanced software engineering concepts. “You start looking at things in a different manner and I think you can see things more in context. It sticks a little bit better.”

“In most object-oriented programming languages, objects of the same class can, for example, directly access each other’s private attributes. People often don’t know that at all. Then they give it a try, and it turns out to be true. They often say something like: ‘But it’s not like that in C#.’ And then it turns out to be exactly the same there,” Van Roosmalen gives as an example. “It seems elementary, but there are important ideas behind it, and most developers like to talk about that again. Most training courses are about the process and all that, and not about the technical stuff.”
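
A small C++ sketch of that point: access control is per class, not per object, so a member function may touch the private fields of another instance of the same class (the same holds in Java and C#).

#include <iostream>

class Account {
public:
    explicit Account(double balance) : balance_(balance) {}

    // 'other' is a different object, yet its private member is directly accessible.
    void transfer_from(Account &other, double amount) {
        other.balance_ -= amount;   // legal: same class, different instance
        balance_ += amount;
    }

    double balance() const { return balance_; }

private:
    double balance_;
};

int main() {
    Account a(100.0), b(50.0);
    a.transfer_from(b, 25.0);
    std::cout << a.balance() << " " << b.balance() << "\n";  // prints 125 25
}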

Defensive

“It is precisely this knowledge of object orientation that forms the basis for much of the architecture in a typical application. This means that you are automatically confronted with important software properties such as information hiding and encapsulation, i.e. the idea that you localize information and don’t scatter it throughout the entire system. In practice, you regularly see that a team gives one component extra functionality and then the entire system starts to topple like dominoes. That makes it problematic to add something. In many web applications you see that that idea is slightly broken,” says Van Roosmalen.

Thinking about hiding information, in turn, goes hand in hand with the means by which components communicate with each other: the interface or API. “It enables you to hide the detailed shape of your objects and ensure that no implementation details leak out. You thus make sure that client code can only do what is currently requested, no more and no less,” explains Van Roosmalen.

The idea behind this is that the evolution of components can be decoupled in this way. If a new version of a component continues to do what it used to do via an interface, the software built on it does not have to be modified immediately. A development team that programs against the component can safely use the new version without being afraid that something will fall over in the process. When encapsulation and interfaces are in order, a maintainable, scalable architecture is created almost automatically that can grow with the application.

'The more you offer, the more unintended usages there are.'

This requires, however, that teams adopt a defensive stance when designing their interfaces. “You shouldn’t just offer everything that other teams demand. The more you offer, the more unintended usages there are. This increases the chance that things will fall over with a new version. You can always expand an interface later on, but downsizing is a lot more difficult,” says Van Roosmalen.
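
A compact sketch of both ideas, with illustrative names: clients program against a deliberately small abstract interface, so the component behind it can evolve or be replaced without breaking them.

#include <string>

// The published interface: deliberately minimal, because everything offered here
// can be used (and misused) by clients and is hard to take away later.
class Storage {
public:
    virtual ~Storage() = default;
    virtual void put(const std::string &key, const std::string &value) = 0;
    virtual std::string get(const std::string &key) const = 0;
};

// Version 1 of the component. A later CachedStorage or DatabaseStorage can be
// substituted without touching any client code.
class FileStorage : public Storage {
public:
    void put(const std::string &key, const std::string &value) override {
        (void)key; (void)value;  // ... write to a file ...
    }
    std::string get(const std::string &key) const override {
        (void)key;               // ... read from a file ...
        return {};
    }
};

// Client code depends only on the interface, not on implementation details.
void save_language_setting(Storage &store) {
    store.put("language", "en");
}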

“I have a very nice workshop for that, which I do with the TUE trainees. Several teams are given the task of developing a component with an interface. After that, they have to make a functional extension to it – unexpectedly of course; I keep that secret. Then they have to try to make test cases for each other’s components that work against the first variant, but no longer against the second. These are real eye-openers, because often such tests are made in no time.”

Box of tricks

Developers don’t have to reinvent the wheel every time. For many problems, best practices have been established over the years, in the form of patterns. “For example, you can lay out your architecture in layers: that’s an architecture pattern. To structure those layers, you use various design patterns.”

Van Roosmalen provides a training course entirely dedicated to this subject: Design patterns and emergent architecture. That’s very broad of course, but the idea behind it is mainly to show that you have that box of tricks. “Actually, that’s one of my favorite training courses, because you really talk about software design and because you can use it to tackle all kinds of practical problems. There are also very different technical aspects involved, such as state machines with possible deadlocks.”

Van Roosmalen, together with a former TUE colleague, also provides a follow-up course on a different architecture theme: real-time behavior. Typically, you’re talking about problems with systems that have to perform several concurrent tasks subjected to timing requirements. Nowadays, most modern programming languages allow you to program concurrency, and that gives rise to very special problems such as race conditions and deadlocks.

Here, too, things already go wrong in the basics, notes Van Roosmalen: “A lot of people who have real-time problems use an operating system like Microsoft Windows. Well, that’s not a real-time operating system. It does contain a lot of things like real-time priorities and so on. But many other things that are also necessary for real-time behavior are missing. Then you have to stand on your head a bit to get it right.”

“In real-time systems, for example, you have the problem of priority inversion, where a low-priority task claims a resource so that higher-priority tasks cannot make use of it. There are mechanisms to keep this to a minimum, and they must be in the operating system.”
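
On a POSIX real-time OS, such a mechanism is typically a priority-inheritance mutex: the low-priority holder temporarily inherits the priority of the highest-priority waiter, which bounds the inversion. A minimal sketch, assuming a platform with _POSIX_THREAD_PRIO_INHERIT support:

#include <pthread.h>

static pthread_mutex_t shared_resource_lock;

static void init_priority_inheritance_mutex(void)
{
    pthread_mutexattr_t attr;
    pthread_mutexattr_init(&attr);
    /* Priority inheritance is an OS feature; plain desktop Windows offers no
       direct equivalent, which is part of Van Roosmalen's point. */
    pthread_mutexattr_setprotocol(&attr, PTHREAD_PRIO_INHERIT);
    pthread_mutex_init(&shared_resource_lock, &attr);
    pthread_mutexattr_destroy(&attr);
}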

Vendor lock-in

Van Roosmalen also provides a basic training course around SysML, a variant of UML for systems engineering. Systems engineers model a lot, yet SysML is only used sparsely in practice. There’s a reason for this: “A lot of commercially available tools, such as Matlab and Simulink, are used for systems engineering. These do not deliver standardized models. That is not what tool vendors want anyway, because they might lose business. They play on vendor lock-in.”

But with SysML you can still integrate these models with each other and make an overarching model of your system. The OMG, the standardization group behind SysML, has tried to combine these modeling techniques in such a way that the whole covers everything and at the same time is methodologically sound. “That worked out pretty well, but it makes the modeling languages awfully big.”

As far as Van Roosmalen is concerned, software developers should likewise take modeling a bit more seriously. During his courses, he relies heavily on UML himself – partly because a training course is too short to go into extensive programming, although programs are provided as proof of concept, but also because it offers good starting points to reason about the class structure and to think about the architecture.

“As for UML, most software developers have seen it once before, but the threshold to really get started with it is quite high. Making a good model really does require an investment before it pays off,” Van Roosmalen agrees. In addition, software developers also lack a bit of the discipline that system developers do have, he observes.

'The Agile Manifesto simply states that you need to pay attention to software quality.'

But that again is a symptom of the fact that quality in software is hard to see and only makes itself felt in the long run, whereas that’s naturally different with systems engineering. “For programmers it is therefore important to put quality on the map. And contrary to what is sometimes thought, Agile can help with that,” says Van Roosmalen. “The Agile Manifesto simply states that you need to pay attention to software quality. It says very sensible things about that; however, you do have to practice what is preached there.”

This article is written by Pieter Edelman, tech editor of Bits&Chips.

Recommendation by former participants

At the end of the training, participants are asked to fill out an evaluation form. To the question 'Would you recommend this training to others?' they responded with an 8.8 out of 10.

This training course provides you with a jump start in the Embedded Linux world

Jasper Nuyens was there at the cradle of Embedded Linux and developed the first Embedded Linux training in the world in 2005 in collaboration with Mind in Leuven (now Essensium). High Tech Institute has been offering the training in Embedded Linux in the Netherlands for a number of years, on an exclusive basis. The Belgian Linux pioneer tells us the ins and outs of his five-day course.


Jasper Nuyens, founder of Linux Belgium and Embedded Linux trainer at High Tech Institute.

Just when Jasper Nuyens was going deep into Linux in the mid-nineties, there was an initiative to reduce the footprint of the system. This ‘Linux Router Project’ was an effort to run a complete Linux-based router from a 1.4 megabyte floppy disk. Nuyens: ‘This was done in order to use old PCs with a number of network cards as a server or router. It was a big challenge in which the Linux kernel compilation played a very important part. The tricks we had to pull out of our sleeves to make this work were actually the beginning of embedded Linux: a small system to which you can add many applications.’ The project lives on in current router projects such as OpenWRT and DD-WRT.

Due to his extensive experience, organisations call in Nuyens’ help ‘if there are really difficult problems.’ This ensures that he can delve more deeply into very special cases. Nuyens: ‘That makes the job very interesting. It also ensures that our course stays up to date.’

'We have now had about a hundred editions, while we are on version 65 of the course.'

Nuyens always adapts his training sessions to the latest developments. ‘We have now had about a hundred editions, while we are on version 65 of the course.’ He cites the example of the embedded build systems Buildroot and OpenEmbedded/Yocto, which became available in the new millennium. ‘We also include those in our training. We always adjust the material to recent developments.’


The biggest Embedded Linux problems land on Jasper Nuyens’ desk.

What is it that makes embedded Linux training so successful?
‘With Linux you have the source code of all the components, from the boot loader and the kernel to the tens of thousands of applications that are available in user space. You can use everything and have a lot of choices. This means that you have to learn to see the wood for the trees. There are many possibilities. Many applications behave a little differently, but you can refine them. That is actually the core of our course: helping people find their way and make the best choices.’

You can also learn everything for yourself on the internet.
‘Of course, it is not really rocket science. You can learn individual topics online. The embedded Linux course, however, brings all things together and provides the whole picture, from the ground up. Even people who have been immersing themselves in Embedded Linux for a long time do not always see how things fit together or how each of the different components interacts with the others. The training provides you with a jump start, a push in the embedded Linux world.’

Linux is not a real-time operating system. To what extent does this play a role in the choice for this OS for embedded systems?
‘When people start using Embedded Linux, they usually think that real-time is an important requirement for their system. They often come from microcontrollers running at a low clock speed, where the timing restrictions can indeed impose real-time behaviour. Effectively, however, they often don’t need it. The current systems-on-chip sometimes run Linux at a gigahertz or more. As a result, real-time use of Linux is rarely required in practice.’
‘There are also different levels at which you can work in real time on Linux. We go into that in our training. With the standard Linux kernel you have soft real-time possibilities. But you can also turn Linux into a hard real-time operating system, and there are intermediate forms as well. The students learn to make these choices depending on their needs.’
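
As an illustration of the soft real-time level a standard kernel already offers (a sketch, not course material): a task can request the SCHED_FIFO scheduling policy and lock its memory, while hard real-time behaviour additionally requires a kernel with the PREEMPT_RT patches or similar.

#include <sched.h>
#include <stdio.h>
#include <sys/mman.h>

int main(void)
{
    struct sched_param param = {0};
    param.sched_priority = 80;                     /* 1..99; needs root or CAP_SYS_NICE */
    if (sched_setscheduler(0, SCHED_FIFO, &param) != 0) {
        perror("sched_setscheduler");
        return 1;
    }
    if (mlockall(MCL_CURRENT | MCL_FUTURE) != 0) { /* avoid page faults in the loop */
        perror("mlockall");
        return 1;
    }
    /* ... time-critical loop would run here ... */
    return 0;
}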

To what extent do you go into the legal aspects of Open Source?
‘We do not provide legal training, but we do make clear what each of the different Open Source licenses means and what the consequences are, including the patenting of software and hardware and trademark issues. Many software developers do not know enough about the consequences of having chosen Open Source. They need to be able to keep their management well informed about this and about the few requirements imposed by Open Source licenses.’
‘Some are so enthusiastic about the Open Source story that they do not want to say anything negative about it. But companies that use and implement open source software need to know exactly what they need to do. They also need to have objective information about the obligations and risks.’

Participants should preferably have experience with Linux and with C programming, but is that essential?
‘That’s right. It is nice to have it. We receive two types of students. We have those who have a lot of experience in embedded, but have limited Linux knowledge. And we have those with Linux experience, even with a lot of Embedded Linux experience, who want to deepen their knowledge.’
‘There is a very large difference in level between both profiles. The first group has hardly any experience on the command line. That is a big step for them. On their laptop they have a Linux command prompt and on the embedded board they also have a command prompt. They cannot mix them up. If your embedded board is in boot mode, in the boot loader, that is yet another prompt where you can type different kinds of commands. The same applies: you cannot mix them up.’
‘People from the embedded world without Linux experience usually find it tougher. We find that both groups help each other. We start the course with information that is fairly basic for people who have already worked with Linux. However, it offers the necessary depth to discover new things, so that the more experienced embedded Linux people do not get bored. It is a case of balancing, but the group is sufficiently small to deal with that properly and answer questions from each participant.’

Eight participants is the maximum number.
‘Yes, eight is our magic number, as we have gradually discovered. During the training everyone asks questions from their own perspective – the loose ends that they cannot immediately link to their existing knowledge. That is why we need a degree of interactivity. Students program their platform during the training and there are more advanced optional exercises for those who are faster. It works well when they can immediately ask their questions and I can immediately look into them.’
‘Linux also has a very steep learning curve. In the beginning, you have to assimilate a lot of information. We want to help everyone move forward to really make a real jump start.’

Is that always possible?
‘We are actually doing pretty well there. The internal operation of the Linux kernel is the toughest part. That is not about how programs run on top of the kernel, but about the kernel internals. That is a difficult part for people who already find the material tough. It adds an extra layer of complexity. If we notice that the whole group is slower, we will spend less time on it.’
‘We also offer a separate course on kernel development. People who have to build device drivers can follow the kernel development course after this training. Following the Embedded Linux course first is a requirement, unless people have been working with Linux for at least five years.’

Which people are actually too early?
‘Those who have been working only with microcontrollers and have little knowledge of Linux. We make sure that they can work on the command line, that they can upgrade their board to a new software version that they have fully compiled themselves and that they can boot from the network, using the network file system as a root filesystem, and so on. That works without prior knowledge by letting them do a number of exercises, but all participants benefit if they have learned to work with the Linux command line in advance.’

Will more experienced people get their money’s worth?
‘Absolutely. People with experience in Linux or embedded Linux won’t get bored. The more they know in advance, the more they will get out of it. We provide so much in-depth information and background information that they will always learn. We go wide and very deep.’


In his Embedded Linux training, participants start working with a BeagleBone Black platform.

'The entire BeagleBone design, the complete PCB layout with all its variants can be completely reused by the customers.'

This is a board with a Sitara system-on-chip from Texas Instruments. The American chip manufacturer founded the non-profit BeagleBone Foundation to provide Linux support for these platforms. ‘It is primarily a showcase for the Sitara platform,’ says Nuyens. ‘But it also gives developers a practical step forwards. Everyone can play around with the technology for free. The entire BeagleBone design, the complete PCB layout with all its variants, can be completely reused by the customers. By making minor changes to the reference design, you can speed up the roll-out of new products by reducing potentially lengthy work on the software side.’

If desired, Nuyens also has other boards available for the course. It is possible to run the training on Freescale’s (nowadays NXP) i.MX 6 platform. ‘This is also a popular platform in the Linux world. The i.MX 6 family has single, dual and quad core variants. The latter are more powerful for multimedia applications.’ Other variants on which the embedded Linux training can take place are the ZedBoard from Avnet and Atmel’s AVR32 platform. Nuyens usually runs training on these platforms on specific request, typically delivered in-house at customer locations.

Also read the interview with Jasper Nuyens in Bits&Chips.

This article is written by René Raaijmakers, tech editor of Bits&Chips.

Recommendation by former participants

At the end of the training, participants are asked to fill out an evaluation form. To the question 'Would you recommend this training to others?' they responded with an 8.6 out of 10.