I recently started a pet project using modules in MSVC, the compiler that at present has best support for modules, and ran into a compiler bug where it didn't know how to compile and asked me to "change the code around this line".
So no, modules aren't even here, let alone to stay.
Never mind using modules in an actual project when I could repro a bug so easily. The people preaching modules must not be using them seriously, or otherwise I simply do not understand what weed they are smoking. I would very much appreciate being corrected, however.
Here’s the thing I don’t get about module partitions: They only seem to allow one level of encapsulation.
Program
- Module
  - Module Partition
whereas in module systems that support module visibility, like Rust’s, you can decompose your program at multiple abstraction levels:

Program
- Private Module
  - Private Module
  - Private Module
- Public Module
- Public Module
Maybe I am missing something. It seems like you will have to rely on discipline and documentation to enforce clean code layering in C++.

Like most languages with modules.
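For concreteness, the single level C++20 gives you looks something like this (a multi-file sketch; the `math`/`arithmetic` names are made up, and note that a partition cannot itself have partitions):

```cpp
// math.ixx (primary module interface unit)
export module math;            // the module
export import :arithmetic;     // pull in and re-export the partition

// math-arithmetic.ixx (partition interface unit)
export module math:arithmetic; // a partition of module `math`
export int add(int a, int b) { return a + b; }

// main.cpp (a consumer can import math, but never math:arithmetic)
import math;
int main() { return add(1, 2) == 3 ? 0 : 1; }
```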
Rust, Modula-2 and Ada are probably the only ones with module nesting.
Notably, many languages in the ML family have first-class modules.
Rust's re-exports also allow you to design your public module structure separate from your internal structure.
I don't think you're missing something. The standards committee made a bad call with "no submodules", ran into insurmountable problems, and doubled down on the bad call via partitions.
"Just one more level bro, I swear. One more".
I fully expect to sooner or later see a retcon on why really, two is the right number.
Yeah, I'm salty about this. "Submodules encourage dependency messes" is just trying to fix substandard engineering across many teams via enforcement of somewhat arbitrary rules. That has never worked in the history of programming. "The determined Real Programmer can write FORTRAN programs in any language" is still true.
https://arewemodulesyet.org/ gives you an overview of which libraries already provide a module version.
Wow, the way this data is presented is hilarious.
Log scale: Less than 3% done, but it looks like over 50%.
Estimated completion date: 10 March 2195
It would be less funny if they used an exponential model for the completion date to match the log scale.
C++ templates and metaprogramming are fundamentally incompatible with the idea of your code being treated as modules.
The current solution chosen by compilers is to basically have a copy of your code for every dependency that wants to specialize something.
For template heavy code, this is a combinatorial explosion.
D has best-in-class templates and metaprogramming, and modules. It works fine.
It has worked perfectly fine while using VC++, minus the usual ICEs that still come up.
It works perfectly when it comes to `import std` and making things a bit easier.
It does not work very well at all if your goal is to port your current large codebase to incrementally use modules to save on compile time and intermediate code size.
The fact that precompiled headers are nearly as good for a much smaller investment tells you most of what you need to know, imo.
Can someone using modules chime in on whether they’ve seen build times improve?
import std; is an order of magnitude faster than including the standard library headers individually, if that's evidence enough for you. It's faster than #include <iostream> alone.
Chuanqi says "The data I have obtained from practice ranges from 25% to 45%, excluding the build time of third-party libraries, including the standard library."[1]
[1]: https://chuanqixu9.github.io/c++/2025/08/14/C++20-Modules.en...
Yeah, but now compare this to pre-compiled headers. Maybe we should be happy with getting a standard way to have pre-compiled std headers, but now my build has a "scanning" phase which takes up some time.
From the outside looking in, this all feels like too little, too late. Big tech has decided on Rust for future infrastructure projects. C++ will get QoL improvements… one day; the committee seems unable to either keep everyone happy or disappoint any one stakeholder. C++ will be around forever, but will it be primarily legacy?
Why use modules if PCH in your diagram is not much worse in compile times?
Macro hygiene, static initialization ordering, control over symbol export (no more detail namespaces), slightly higher ceiling for compile-time and optimization performance.
If these aren't compelling, there's no real reason.
Having implemented PCH for C and C++, it is an uuugly hack, which is why D has modules instead.
Modules are the future, and the rules for them are well thought out. Every compiler has its own version of PCH and they all work differently in annoying ways.
I can’t deploy C++ modules to any of the hardware I use in the shop. Probably won’t change in the near-to-mid future.
It seems likely I’ll have to move away from C++, or perhaps more accurately it’s moving away from me.
If your tools are not updated, that isn't the fault of C++. You will feel the same about Rust when forced to use a 15-year-old version too (as I write this, Rust 1.0 is only 10 years old). Don't whine to me about these problems; whine to your vendors until they give you the new stuff.
When one of the main arguments people use to stick with C++ is that it "runs everywhere", it actually is the fault of C++. After all, what use is there for a C++ where the vast majority of the library ecosystem only works with a handful of major compilers? If compatibility with a broad legacy ecosystem isn't important, there are far more attractive languages these days!
Just like Python was to blame for the horrible 2-to-3 switch, C++ is to blame for the poor handling of modules. They shouldn't have pushed through a significant backwards-incompatible change if the wide variety of vendor toolchains wasn't willing to adopt it.
> If you tools are not updated that isn't the fault of C++.
It kinda is. The C++ committee has been getting into a bad habit of dumping lots of not-entirely-working features into the standard and ignoring implementer feedback along the way. See https://wg21.link/p3962r0 for the incipient implementer revolt going on.
My experience with vendor toolchains is that they generally suck anyway. In a recent bare-metal project I chose not to use the vendor's IDE and toolchain (which is just an old version of GCC with some questionable CMake scripts around it) and instead just cross-compile with Rust manually. So far it's been a really good decision.
Yep, this aligns with my experience. I’ve yet to take the plunge into cross compiling with rust though, might have to try that.
Nobody is "whining" to you. Nobody is mentioning rust. Your tone is way too sharp for this discussion.
> whine to your vendors until they give you the new stuff.
How well does this usually work, by the way?
If C++ libraries eschew backward compatibility to chase after build time improvements, that’s their design decision. I’ll see an even greater build time improvement than they do (because I won’t be able to build their code at all).
This is not an argument against modules. This is an argument against allowing areas that don’t upgrade hold modern c++ back.
"C includes show its age." But C++ is staying around not because there is a "++" there, but because there is a "C".
> auto main() -> int {
Dude…
i was sincerely hoping i could get
auto main(argc, argv) -> int
int argc;
char **argv;
to work, but alas it seems c++ threw pre-ANSI argument type declarations out.

And their code example doesn't actually return a value!
For main it's explicitly allowed by the standard: reaching the end of main without a return statement is equivalent to return 0.
It's like calling a Ford Mustang Mach-E the "Model T++."
It's been the go-to syntax for 15 years now
Go-to? I've never seen a project use it, I've only ever seen examples online.
Same here
Now I haven't touched C++ in probably 15 years but the definition of main() looks confused:
> auto main() -> int
Isn't that declaring the return type twice, once as auto and the other as int?
No. The auto there is just a placeholder so that you can write the return type after the parameter list. The return type is only specified once.
There is, however, a return type auto-deduction in recent standards iirc, which is especially useful for lambdas.
https://en.cppreference.com/w/cpp/language/auto.html
auto f() -> int; // OK: f returns int
auto g() { return 0.0; } // OK since C++14: g returns double
auto h(); // OK since C++14: h’s return type will be deduced when it is defined
What about
auto g() -> auto { return 0.0; }
I really wish they had used func instead; it would have avoided this confusion and allowed “auto type deduction” to be a smaller, more self-contained feature.