True, but higher demand for heating in winter is not specific to Germany. In the end, Germany will likely need to import energy anyway - just as it does today. Which means this question may not be the most relevant one.
It will be a mix. There is certainly still a lot which can be done on the demand side, e.g. when to cool a cold-storage warehouse during the day, or when to run certain production lines.
No, capacity factor is a distraction. The only meaningful question is whether it is cost-effective compared to other production methods, and solar - taking the low capacity factor into account - is starting to look very good.
"That makes everything slow, inefficient, and widely dangerous."
There is nothing faster and more efficient than building C programs. I am also not sure what is dangerous about having libraries. C++ is quite different though.
Of course there is. Raw machine code is the gold standard, and everything else is an attempt to achieve _something_ at the cost of performance, C included - even considering whole-program optimization and ignoring the overhead introduced by libraries. Other languages with better semantics frequently outperform C (slightly) because the compiler can assume more about the data and instructions being manipulated and generate tighter code.
I was talking about build times, not run time. But regarding run time, no other language outperforms C in practice. Your argument about "better semantics" has a grain of truth to it, but it does not apply to any existing language I know of - at least not to Rust, which in practice is for the most part still slower than C.
Neither "ODR violations" nor IFNDR exist in C. Incompatibility across translation units can cause undefined behavior in C, but this can easily be avoided.
The ODR problem is much more benign in C. Undefined behavior at translation time (~ IFNDR) still exists in C but for C2y we have removed most of it already.
You can't fundamentally solve the issue of what happens if you call a function in another TU that takes a T but the caller and the callee have a different definition of T.
Whether you call that IFNDR or UB doesn't make much of a difference.
C++ mitigates that issue with its name mangling (the parameter types are encoded in the symbol, so a mismatch fails to link); Rust goes the extra mile and puts a hash of the whole definition of the arguments in the symbol name.
C has the most unsafe solution (no mitigation at all).
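A minimal sketch of the mangling point, assuming g++ on a Unix-like system (file names and the function `get_x` are made up for illustration): two TUs disagree about a parameter type, and because C++ encodes parameter types in the symbol name, the mismatch is caught at link time instead of becoming UB. In C both symbols would just be `get_x` and the program would link silently.

```shell
# Two TUs that disagree on the parameter type of get_x().
cat > lib.cpp <<'EOF'
int get_x(long v) { return (int) v; }
EOF
cat > main.cpp <<'EOF'
int get_x(int v);                 // mismatched declaration
int main() { return get_x(1); }
EOF
g++ -c lib.cpp main.cpp
# main.o references _Z5get_xi, lib.o defines _Z5get_xl, so the
# link fails - the mismatch is diagnosed at build time.
! g++ lib.o main.o -o demo_cpp 2> link.log
grep undefined link.log
```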
C++'s ODR requires different definitions to consist of the same tokens, and if not, the program is IFNDR. Name mangling catches some mismatches, but this is becoming less relevant today, with more code being generated via templates from headers.
In C, it is UB when the types are not compatible, which is more robust. In practice it is also easy to avoid, with the same solution as in C++: a single header which declares the object. But even if not, tooling can check consistency across TUs; it is just not required by the ISO standard (which Rust does not even have, so the comparison makes no sense). In practice, a GCC LTO build detects inconsistencies.
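A small sketch of the LTO point, assuming gcc with LTO support (file names and `struct point` are invented for illustration): a struct defined incompatibly in two TUs links silently in a plain build, while an LTO build can diagnose the mismatch at link time because GCC enables `-Wlto-type-mismatch` by default with `-flto`.

```shell
cat > tu_a.c <<'EOF'
struct point { int x, y; };
int get_x(struct point p) { return p.x; }
EOF
cat > tu_b.c <<'EOF'
struct point { long x, y; };      /* incompatible definition */
int get_x(struct point p);
int main(void) { struct point p = {1, 2}; return get_x(p); }
EOF
# A plain build accepts this silently: UB at run time.
gcc tu_a.c tu_b.c -o demo_plain
# An LTO build sees both definitions at once and typically
# prints a -Wlto-type-mismatch warning for get_x.
gcc -flto tu_a.c tu_b.c -o demo_lto 2> lto.log || true
grep -i mismatch lto.log || true
```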
Different parts of the build seeing inconsistent definitions of the same name is a clear consequence of building things piecemeal rather than as a single project -- which is precisely the problem I described higher up in this thread.
Things being built piecemeal also likely won't be using LTO (even if fat LTO allows this, no static library packages in a distro are built with it).
Not sure what you are trying to say. Inconsistent definitions are a consequence of being able to build things separately, which is a major feature of C and C++, although in C++ it no longer works well. The reason not to build things piecemeal is often that LTO is more expensive, but occasionally running an LTO build will catch violations. When libraries get split up, the interface is defined via headers, and there is little risk of an inconsistency. So I really do not think there is a major problem in C.