The state of C++

Despite a host of up-and-coming alternatives, C++ is still a force to be reckoned with, certainly in the legacy-fraught high-tech industry. In a series of articles, High Tech Institute trainer Kris van Rens puts the language in a modern perspective. In his new 4-day training course, he also introduces participants to the language basics and essential best practices.

Last July, the Carbon programming language was officially announced at the CppNorth C++ conference in Toronto, Canada. Carbon is presented as “an experimental successor to C++” and was started as an open-source project, by Google no less. Wait… Google is going to create a C++ successor? Until recently, the company was heavily involved in developing the C++ language and engineering the Clang C++ front-end for the LLVM compiler. With tens of thousands of engineers within Google working on billions of lines of code, choosing the path of a completely new language seems rather bold.

Why would a huge company such as Google venture into such a daring project? Well, it’s symptomatic of the state and development of C++. For those who haven’t caught up with the language’s evolution in the past few years: there have been some major discussions. Of course, having discussions is the whole point of the C++ committee meetings, but one topic has been popping up again and again without resolution: whether or not it’s worth improving the language design at the cost of backward compatibility.

Leaner governance

C++ has been around for about forty years now and is used to create performance-critical software all over the world. After a period of relative quiet following the initial ISO standardization in 1998, the committee has managed to steadily introduce major improvements every three years since 2011. As a result, the language has grown quite different from what those of us who are old enough used to work with in the nineties and noughties. The addition of features like concepts, ranges and modules in C++20 alone packs a powerful punch.
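To give a feel for two of those C++20 additions, here’s a minimal sketch (my example, not from the article) of a concept constraining a template, combined with a ranges pipeline; the names Numeric and twice are invented for illustration:

#include <concepts>
#include <iostream>
#include <ranges>
#include <vector>

// A concept: Numeric is only satisfied by arithmetic types.
template <typename T>
concept Numeric = std::integral<T> || std::floating_point<T>;

// The constraint is checked at the call site, giving readable errors.
template <Numeric T>
T twice(T value) { return value + value; }

int main() {
  const std::vector<int> values{1, 2, 3, 4, 5, 6};

  // A ranges pipeline: lazily keep the even numbers, then double them.
  for (int v : values
                 | std::views::filter([](int n) { return n % 2 == 0; })
                 | std::views::transform([](int n) { return twice(n); })) {
    std::cout << v << ' ';  // prints: 4 8 12
  }
  std::cout << '\n';
}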

At the same time, though, the C++ language evolution process is known to be extremely challenging. The weight of carrying decades of technical debt while maintaining backward compatibility is substantial – too much for some, it seems. Trying to add a significant language feature may cost up to ten years of lobbying, discussions, reviews, testing, more reviews and meticulous wording. Of course, introducing considerable changes in a project with this many stakeholders is no mean feat, but ten years in today’s tech world is a lifetime. Another challenge is that the ISO committee is predominantly Western, heavily underrepresenting big Asian C++ user bases like those in India and China. These downsides don’t look good, especially not in the light of rapidly growing, modern, openly governed (and relatively young) languages like Rust or Swift.


''Still, I think right now is a very important time for C++ to consider its position in the systems programming universe; it can’t ignore the signals any longer.''

Is the technical debt of the C++ language really of such gargantuan proportions that it’s next to impossible to add new high-impact features? One-man army Sean Baxter of the Circle C++ compiler has shown that it’s not. In the past months alone, he single-handedly demonstrated that it’s possible to add considerable features like a true sum type and language-level tuples. Granted, an implementation in a single compiler of a C++ dialect without a thoroughly reviewed proposal is far from an official C++ language feature, but at least it shows how much wiggle room and opportunity there is in the syntax and language as a whole – if we really set our minds to it. It also shows that the burden of technical debt alone isn’t the limiting factor in the language’s development.
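For context: the closest standard C++ gets today is the library type std::variant – workable, but noisier than a true language-level sum type would be. A minimal sketch of the library approach (my example, not Baxter’s Circle syntax):

#include <iostream>
#include <string>
#include <variant>

// A "result" that holds either a parsed value or an error message.
using ParseResult = std::variant<int, std::string>;

ParseResult parse_number(const std::string& text) {
  try {
    return std::stoi(text);
  } catch (const std::exception&) {
    return std::string{"not a number: "} + text;
  }
}

int main() {
  for (const auto& input : {"42", "hello"}) {
    // std::visit dispatches on whichever alternative is currently held.
    std::visit(
        [](const auto& value) { std::cout << value << '\n'; },
        parse_number(input));
  }
}

A language-level sum type would make this dispatch part of the syntax itself, which is exactly the kind of ergonomics the Circle experiments explore.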

The C++ language governance model isn’t likely to change anytime soon, tied in as it is with the ISO process and the committee stakeholders. Still, I think right now is a very important time for the language to consider its position in the systems programming universe; it can’t ignore the signals any longer. Perhaps a leaner governance structure will help, or allowing breaking changes to shed technical debt in a future version – who knows. Unfortunately, such substantial changes to the process will most likely take years as well.

Wait and see

Will the drawbacks cause C++ to be eliminated anytime soon? No, definitely not. The sheer momentum of the existing code and user base is overwhelming. ‘Just’ switching to another language isn’t an option for everyone, not even for Google. For that to work out, true interoperability with C++ (not just C) is needed, which is where alternatives like Rust and Swift still fall short. Not for nothing is Google advertising C++ interoperability as a key feature of Carbon, allowing organizations to adopt the language step by step from a large existing C++ code base.

At the moment, however, Carbon isn’t much more than a rough specification and an announcement. We’ll have to wait and see whether it can live up to expectations. In the meantime, C++ will evolve as well, hopefully positively inspired by the possibilities of Circle and other languages in the field.

 

System architecting for politicians

System architect

Over there, under the parasol, cap, sunglasses, beer, that must be our prime minister.
If I arrange another beer, can I join you?

Beer is welcome and if you don’t talk politics, you can join us.
System architect in politics. Illustration: Rutte with Luud Engels

Deal! I’m politically illiterate anyway. I’m giving a training course here and can only talk a little about high-tech system architecting.
 

Sounds interesting! I’ve been on quite a few trade missions and I know the Netherlands plays a leading role there.

 
It certainly does! I’ve had the opportunity to work for companies that could predict which high-tech product they needed to have on the market in three years and put genius researchers and supremely capable engineers to work to reach that goal.

Precisely because it takes a considerable number of different areas of expertise to develop, manufacture and maintain such a high-tech product, a tangle of conflicting requirements arises from that multitude of disciplines. But the successful companies stand out because despite this tangle, they can agree on an approach and thus make the right decisions in a timely manner.
 

That must indeed be enormously complex. But fortunately, those bright minds and handy hands know which calculations and models to apply. In my work, we also apply models, but they’re more fodder for discussion than a route to consensus and correct decisions. With us, it’s more human work.

 
There may be more similarities there than you would think. All experts in high-tech are lords and masters in their field and often take the stage to showcase exactly that: scientific superiority.

On the one hand, you desperately need the expertise, models and calculations to keep those professionals innovating in their fields, digging deeper and deeper tunnels. And on the other hand, every new insight in a certain discipline is used as a weapon to beat the brains out of experts from other tunnels.

Islands arise, sometimes even camps, and the plague is that they all have a valid point.
 

Okay, okay, so it’s human work too. But you said just now that they do come to an agreement. So how do they do that?

 
It’s all about system architecting. They reach working agreements – you can call it an approach – in which the various disciplines provide each other with insight into where, in essence, the contradiction is manifesting itself and for which parameters a balanced solution must be found. So this is not about negotiating or trying to reach consensus, but making jointly weighted choices. Once they all have an overview and agree on the entire system, these bright minds subordinate their own tunnel wisdom to, say, the higher good.


 

Nice that that’s how it works in high-tech, but things are quite different with us. No doubt you have seen debates where people are too busy proclaiming their own party’s truth and unwilling to listen to each other, let alone understand each other. That system architecting of yours wouldn’t work with us.

 
Let me play devil’s advocate: those debates lack the common purpose that does prevail within successful companies. In the debates, the system goal is conspicuous by its absence.
 

No, it can’t be because of that. For example, we have set a very clear goal for nitrogen reduction: 50 percent less by 2030. How concrete do you want a goal to be?

 
Here you touch on a basic error. You see, that reduction is not a system goal. This is exactly where constructive companies differ from politics. Let me explain.

The system goal will include terms such as food quantity, food quality, sustainable operations and conservation of the environment. However, no system has ever been developed with the goal of reducing nitrogen, which is exactly why many protest as soon as you do set that as a goal. Don’t get me wrong, I’m no climate skeptic. I see the excess nitrogen deposition as a negative effect that needs to be fixed.

I’m pretty sure that farmers, citizens and businesses subscribe to the system goal of producing food in a sustainable way in the Netherlands. If you had invited them to keep heading towards that system goal while repairing the nitrogen surplus, you would have gotten cooperative thinkers instead of counter-thinkers. The system goal always involves a desired effect, and most people therefore want to contribute to it.
 

I see your point. So the Netherlands can be governed by system architects?

 
Govern it, no, but even politicians would benefit from practices and methods such as those used within system architecting:

''Proclaiming system goals results in solution supporters.''

''Proclaiming solutions results in aimless opponents.''

 

So we messed up?

 
By this line of reasoning, certainly, yes! But things have been messed up in high-tech too, and there will be more misses, yet every mistake is an opportunity to improve. How else do you think system architecting came about?
 

Hey, you weren’t going to talk politics!

 
I didn’t, we just talked about decision making.
 

A flying start in modern C++

C++ training
Despite a host of up-and-coming alternatives, C++ is still a force to be reckoned with, certainly in the legacy-fraught high-tech industry. Drawing from almost 25 years of experience, computer programming enthusiast Kris van Rens introduces coding rookies to the language basics and essential best practices in his new C++ Fundamentals training at High Tech Institute.

 

Over the years, a long list of programming languages has been put forward to supplant C++. D, Rust, Apple’s Swift, recent Google addition Carbon and lesser-known alternatives like Nim and Vale, to name a few – they all have their merits and their specific application areas. Nonetheless, C++ is still very much alive and kicking, contends computer programming enthusiast Kris van Rens.

“There’s this widely held view that you shouldn’t use C++ because it’s outdated,” says Van Rens. “But it’s actually this view that’s outdated. It’s based on old-style C++. Ever since modern C++ was born in 2011, the language has kept moving with the times. With the right provisions, it’s no less relevant than up-and-coming alternatives like Rust.”

In the new 4-day training course “C++ fundamentals,” organized by High Tech Institute in the last two weeks of March, Van Rens introduces participants to the language basics and essential best practices. “I aim to impart a positive vibe about C++. I want participants to leave feeling that they can really do something with it and knowing how to put it to good use.”

 

Bread and butter

Van Rens has been captivated by the wonderful world of programming ever since he first laid his hands on his dad’s ZX Spectrum home computer. “It was a machine from 1983, the year I was born. When I was 7 or 8, I started tinkering with it, using the Basic programming language. In high school, in an extracurricular activity, a fellow student taught me x86 real-mode assembly, followed by C and then C++. In the past few years, I’ve been delving into Rust as well.”

After a bachelor’s in mechatronics at Niederrhein University of Applied Sciences in Krefeld, Germany, Van Rens did a master’s in electrical engineering at Eindhoven University of Technology (TUE), specializing in video coding and architectures under the supervision of professor Peter de With. “The university is where I became a software engineer and had my first taste of teaching. In parallel to my graduation project, which was about converting MPEG-2 into H.264 video streams, I created an image and video coding tutorial – for my own understanding but also just for the fun of it and to convey that fun to others. Spreading my enthusiasm has always been an important driver for me.”

 
C++ training Kris van Rens

In 2009, Van Rens started his professional career at his current employer, the smart surveillance specialist Vinotion. At the time, this TUE spinoff was still a startup, working from an office space belonging to De With’s research group. “From coding image analysis algorithms in C++, my focus slowly shifted to the development platform as a whole, including the language itself, the programming interfaces and the tooling. That became my bread and butter: enabling the creation of robust, high-performance, high-quality code using solid software architecture.”

Van Rens’ career in training got a leg up when he was asked to speak at one of the informal 040coders meetings for computer programmers in the Eindhoven region. “The meeting was hosted by Philips Image Guided Therapy and I did a skit about introducing kids to C++. Afterward, IGT invited me to give a serious presentation to their engineers. So I did a tryout on an in-depth C++ topic. They liked it so much that it’s grown into a quarterly event – so far all online, due to Covid, with me presenting from my attic to crowds as big as 150 people.”

 

Encounters with legacy

The new C++ classroom training at High Tech Institute is targeted at much smaller groups of at most a dozen software engineers with basic programming skills – any language will do. It’s also much more hands-on, with practical exercises drawn from Van Rens’ 10+ years of industrial experience. “These exercises aren’t theoretical and made-up like the ones I’ve seen in so many other courses. They’re real industrial cases, inspired by the problems I encounter in my daily work.”

A key concept addressed by Van Rens is the craft of systems programming. “When you’re writing embedded software for a high-tech system, you’re much closer to the hardware than when you’re creating a web application,” he points out. “In JavaScript, for example, you generally don’t have to concern yourself with things like memory management; the interpreter takes care of that for you. In an embedded system, you do have to worry about the often limited resources and how to use them wisely. Systems programming is all about being aware of the intricacies on all levels and having the flexibility to act accordingly. The training provides the tools for that. This knowledge applies to any systems programming language.”
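As an illustration of that mindset – my own sketch, not an exercise from the training – consider a fixed-capacity buffer that avoids the heap altogether, keeping memory usage bounded and predictable on a resource-constrained target:

#include <array>
#include <cstddef>
#include <cstdio>

template <std::size_t Capacity>
class SampleBuffer {
 public:
  // Returns false instead of allocating when the buffer is full,
  // so memory usage stays bounded and predictable.
  bool push(int sample) {
    if (size_ == Capacity) { return false; }
    samples_[size_++] = sample;
    return true;
  }
  std::size_t size() const { return size_; }

 private:
  std::array<int, Capacity> samples_{};  // storage reserved up front
  std::size_t size_{0};
};

int main() {
  SampleBuffer<4> buffer;  // capacity is a compile-time decision
  for (int i = 0; i < 6; ++i) {
    if (!buffer.push(i)) {
      std::printf("buffer full, dropping sample %d\n", i);
    }
  }
  std::printf("stored %zu samples\n", buffer.size());
}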

''A lot of legacy is written in some older version of C++.''

The omnipresence of legacy code in the high-tech industry is another reason why C++ and his training are so relevant, notes Van Rens. “A lot of that legacy is written in some older version of C++. Sooner or later, you’ll come across this code. Rewriting it or programming against it in Rust or another language is almost never a viable option; you’ll have to deal with it in C++, and my training will help you do that.”

To be prepared for legacy code, participants obviously need to know a thing or two about old-style C++, but the emphasis is on the safe, modern aspects. “After the first standardization in 1998, the language didn’t change much for a long time. It wasn’t until 2011 that C++ underwent a true metamorphosis with the introduction of modern concepts like smart pointers for safer memory usage. Since then, the language has been updated regularly, and so have the best practices,” explains Van Rens. “My primary focus is on the state of the art. Later on in the training, I also equip the participants for their inevitable encounters with old-style C++ by going into the evolution of the language and providing exercises in which pieces of legacy code have to be rewritten using modern constructs.”
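To give a flavor of such a rewrite – a hypothetical before/after, not taken from the course material – compare manual new/delete with the modern smart-pointer equivalent:

#include <memory>
#include <string>

struct Sensor {
  explicit Sensor(std::string name) : name(std::move(name)) {}
  std::string name;
};

// Old style: manual new/delete; every early return or exception
// between the two lines risks leaking the Sensor.
void legacy_style() {
  Sensor* sensor = new Sensor("lidar");
  // ... use sensor ...
  delete sensor;
}

// Modern style: std::unique_ptr releases the Sensor automatically,
// whatever path the function takes (RAII).
void modern_style() {
  auto sensor = std::make_unique<Sensor>("lidar");
  // ... use *sensor; no delete needed ...
}

int main() {
  legacy_style();
  modern_style();
}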

 
C++ training by Kris van Rens

 

Supercharged learning

The fundamentals taught by Van Rens in the upcoming training are more than enough for participants to get off to a flying start, but they’re only a fraction of what there is to tell about C++. “It’s a huge language. I use the book “Beginning C++20” by Ivor Horton and Peter Van Weert as a reference during the course. It’s comprehensive, almost a thousand pages long, but even that’s not nearly enough to cover everything, not by a long shot. In four days, I aim to lay a solid foundation, showing participants the ropes and where to go if they want to dive deeper.”

''Nothing beats learning by doing.''

Books and online tutorials only get you so far. Nothing beats learning by doing, maintains Van Rens, pointing to the added value of classroom teaching. “Working with C++ since 1998, I’ve built up almost 25 years of experience that I’m amply sharing in class. I’m not just explaining the language; I know where to put the right emphasis, supporting my story with relevant best practices, examples and exercises from industry. It’s supercharged learning.”

 
This article is written by Nieke Roos, tech editor of Bits&Chips.

Trend 5: Tooling

Trainer System Requirements Engineering
High Tech Institute trainer Cees Michielsen highlights a handful of trends in the field of system requirements engineering. He provides the 2-day training “System requirements engineering improvement” for High Tech Institute several times a year.

In each “System requirements engineering improvement” training, the question arises: what is the best tool to manage our requirements? To answer this question, you would need to know all the bells and whistles of every requirements management tool, including the latest updates. At the same time, you need a good understanding of the operating environment in which you want to introduce the tool. In short: an impossible task.

Don’t stress, there’s still a first distinction to make, based on the type of business you’re in. An organization like ASML isn’t required to conform to international standards for safety or information security, or to FDA regulations. Even Word or Excel is then suitable to manage everything – and surprisingly, this often seems to be the preference.

When norms and regulations are involved, the software must support audit trails. That means providing basic functionality for configuration management, like version management, change management and status administration. Quite amazingly, many requirements management tools don’t provide these basic features.

Another distinction to make concerns baselines. In its most basic form, a baseline is a snapshot of the requirements database, containing a subset of requirements. More advanced tools provide workflows to select specific requirements with specific characteristics from the database. Such a baseline can be reviewed, modified, approved and released as a separate entity in the tool. Once released, the content of the baseline can no longer be changed.
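In code terms, the concept boils down to something like the following toy model (my sketch, not any particular tool’s API): a selected set of requirement IDs that becomes read-only upon release.

#include <stdexcept>
#include <string>
#include <vector>

class Baseline {
 public:
  void add(std::string requirement_id) {
    if (released_) { throw std::logic_error{"baseline is released"}; }
    requirement_ids_.push_back(std::move(requirement_id));
  }
  void release() { released_ = true; }  // content is frozen from here on
  const std::vector<std::string>& contents() const { return requirement_ids_; }

 private:
  std::vector<std::string> requirement_ids_;
  bool released_{false};
};

int main() {
  Baseline b;
  b.add("REQ-0042");
  b.release();
  // b.add("REQ-0043");  // would throw: released baselines don't change
}

Real tools would of course also version, review and approve baselines; the point here is only the released-means-frozen rule.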

'In its most basic form, a baseline is a snapshot of the requirements database, containing a subset of requirements'

In some industries, like automotive, the exchange of baselines between OEMs and suppliers is a standard procedure (better known as the Lastenheft-Pflichtenheft information exchange), but baselines are useful internally as well. They create tranquility and stability in an environment that is notorious for requirements changing throughout multidisciplinary product development.

Using baselines to decouple different dynamics can be very effective. This is also the basis of my earlier comment about Word and Excel: with their relatively slow pace of change, they can dictate an appropriate heartbeat for initiating change in the development organization. For example, a fast-changing agile way of working with biweekly sprints can coexist with a long-lead-time process with quarterly updates and annual releases.

A strong argument in favor of using RM tools is their traceability support. Most tool suppliers advertise that their tool can couple requirements to other requirements. As I wrote in my column on traceability, this doesn’t seem to happen in practice. It’s even being discouraged.

Traceability is all about finding the source of requirements. These sources are either a requirements analysis statement or a higher-level design decision. Therefore, it’s essential that the tool offers the possibility to create entities other than requirements and is able to make specific relations (trace links) between, for example, requirements and design decisions.
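A minimal sketch of such a data model – with invented names, not any tool’s actual schema – might look like this:

#include <iostream>
#include <string>
#include <vector>

enum class EntityKind { Requirement, DesignDecision };

struct Entity {
  int id;
  EntityKind kind;
  std::string text;
};

struct TraceLink {
  int from_id;           // e.g. a requirement ...
  int to_id;             // ... tracing back to its source design decision
  std::string relation;  // e.g. "derived-from"
};

int main() {
  std::vector<Entity> entities{
      {1, EntityKind::DesignDecision, "Split the drive train in two stages"},
      {2, EntityKind::Requirement, "Stage 1 gear ratio shall be 1:4"},
  };
  std::vector<TraceLink> links{{2, 1, "derived-from"}};

  for (const auto& link : links) {
    std::cout << "entity " << link.from_id << " is " << link.relation
              << " entity " << link.to_id << '\n';
  }
}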

Again, you’ll be amazed at the number of tools that can’t support this.

In my previous column, I presented the W-model in response to the need to assign properties (such as mass and volume) to system elements. You would expect this to be supported by suppliers that call their tools product lifecycle management (PLM) tools, like Siemens, IBM, Dassault, Contact Software and PTC. Unfortunately, most of these PLM tools see the first CAD drawings as the beginning of the product’s lifecycle. Let me just point out: that’s almost at the end of the development process!

'It is essential that the tool offers the possibility to create other entities than requirements, and is able to make specific relations (trace-links) between requirements and design decisions for example'

Put differently: these tools completely skip the left legs of the W-model (from birth to adolescence). In the past years, these PLM tool suppliers have tried to fill the gap by acquiring model-based systems engineering tool providers or by offering interfaces to third-party tools (Enterprise Architect, Capella, No Magic), letting the poor customer deal with the integration and interface issues, while also being left with outdated concepts such as RFLP, incompatible terminology, the complete absence of the product properties concept and the lack of support for attributes on relationships.

A couple of popular requirements management tools, like Doors, Polarion, Relatics, TopTeam and Jama, are expanding their functionality towards systems engineering. They’re being confronted with the preconditions that must be met from an SE perspective: proper configuration, change and release management, cross-context traceability, lifecycle support for system elements, baselining and more.

Returning to our original topic of the day: I have yet to see an RM tool that supports not only the management aspects but also the engineering aspects. Like the quantification of requirements with tags and qualifiers according to Gilb; relating functions and function properties (also to get rid of the outdated split into functional and non-functional requirements); writing property-specific requirements; support for ensuring the intrinsic quality of requirements (functionality provided by tools like QVScribe); easy comparison of the system as required, the system as designed and the system as tested. A requirement is more than just a textual description.
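To make the quantification idea concrete – a toy rendering with invented field names, loosely following Gilb’s Planguage – a quantified requirement carries a scale, a meter and measurable target levels rather than prose alone:

#include <iostream>
#include <string>

struct QuantifiedRequirement {
  std::string ambition;  // the prose intent
  std::string scale;     // what is measured, and in which unit
  std::string meter;     // how it is measured
  double target;         // the value to reach
  double tolerable;      // the worst still-acceptable value
};

int main() {
  const QuantifiedRequirement startup{
      "The system shall start quickly",
      "seconds from power-on to operational",
      "stopwatch measurement per standard startup procedure",
      5.0, 10.0};
  std::cout << startup.ambition << ": target " << startup.target
            << " (" << startup.scale << ")\n";
}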

In the column about traceability, I mentioned that requirements tooling can help answer the question: why does this requirement exist and why does it have this value? PLM tooling, in turn, can help answer questions such as: if this element fails in the field, which system function(s) are affected?