Optimisation is dead time. I swore I’d never do it. Nothing but scripting languages, I swore. Garbage collection is not my business. Except, as it turns out, CPU cycles aren’t free, and I want a lot of them. So, as long as I need to make do with the slight resources available to me, I might need to commit some time to the occasional rewriting of inner loops in something a bit fast, with decent libraries written by those more speed-obsessive than myself. C++ passes muster in this regard.
I am investigating this mainly for rewriting inner loops of Python simulation code in C++ at the moment, but if I get more chops I might try some recreational, creative applications – there are such handy tools for that, dontcha know?
However, for the moment I’m mostly avoiding directly touching C++ at all by using Tensorflow.
Provocative C++ projects
Creative coding, new-wave: Cinder
Creative coding, powerful but clunky: openFrameworks
Computer vision: OpenCV
In-browser compiled extensions: NaCl…
Thrill is a C++ framework for distributed Big Data batch computations on a cluster of machines. It is currently being designed and developed as a research project at Karlsruhe Institute of Technology and is in early testing.
Some of the main goals for the design are:
To create a high-performance Big Data batch processing framework.
Expose a powerful C++ user interface, that is efficiently tied to the framework’s internals. The interface supports the Map/Reduce paradigm, but also versatile “dataflow graph” style computations like Apache Spark or Apache Flink with host language control flow. …
Leverage newest C++11 and C++14 features like lambda functions and auto types to make writing user programs easy and convenient.
Enable compilation of binary programs with full compile-time optimization runnable directly on hardware without a virtual machine interpreter. Exploit cache effects due to less indirections than in Java and other languages. Save energy and money by reducing computation overhead.
Due to the zero-overhead concept of C++, enable applications to process small datatypes efficiently with no overhead.
Things to read
C++ Reference Wiki is a place to document your own learning.
cplusplus.com has a basic but serviceable tutorial
C++ is only tolerable with Boost, which has an excellent manual for modern C++ coding
Mark C. Chu-Carroll on C/C++ for numerical computing
C and C++ suck rocks as languages for numerical computing. They are not the fastest, not by a long shot. In fact, the fundamental design of them makes it pretty much impossible to make really good, efficient code in C/C++. There’s a good reason that Fortran is still the language of choice for real, intense scientific applications that require the absolute best performance that can be drawn out of our machines. Making real applications run really fast is something that’s done with the help of a compiler. Modern architectures have reached the point where people can’t code effectively in assembler anymore – switching the order of two independent instructions can have a dramatic impact on performance in a modern machine, and the constraints that you need to optimize for are just more complicated than people can generally deal with… So for modern systems, writing an efficient program is sort of a partnership. The human needs to careful choose algorithms – the machine can’t possibly do that. And the machine needs to carefully compute instruction ordering, pipeline constraints, memory fetch delays, etc. The two together can build really fast systems. But the two parts aren’t independent: the human needs to express the algorithm in a way that allows the compiler to understand it well enough to be able to really optimize it… And that’s where C and C++ fall down. C and C++ are strongly pointer-based languages. The real semantics of almost anything interesting end up involving pretty much unrestricted pointers.
Bruce Eckel’s Thinking in C++ book is online.
Yossi Kreinin’s defective C++
Some of [these] by themselves could be design choices, not bugs. For example, a programming language doesn’t have to provide garbage collection. It’s the combination of the things that makes them all problematic. For example, the lack of garbage collection makes C++ exceptions and operator overloading inherently defective. Therefore, the problems are not listed in the order of ‘importance’… Instead, most defects are followed by one of their complementary defects, so that when a defect causes a problem, the next defect in the list makes it worse.