agent327 1 day ago

The answer to this is to replace default-init by zero-init. This removes all special cases and all surprise, at a cost that is minimal (demonstrated experimentally by its implementation in things like Windows and Chrome) or even negative. Doing so would make software safer, and more reproducible, and it would make the object model more sound by removing the strange zombie state that exists only for primitive types.

Of course we should provide a mechanism to allow large arrays to remain uninitialized, but this should be an explicit choice, rather than the default behaviour.

However, will it happen? It's arguably the easiest thing C++ could do to make software safer, but there appears to be no interest in the committee to do anything with safety other than talk about it.

shultays 1 day ago

  Of course we should provide a mechanism to allow large arrays to remain uninitialized, but this should be an explicit choice, rather than the default behaviour.
First you say the cost is "minimal or even negative", and then in the very next paragraph you argue against it.

ddulaney 1 day ago

The general cost across several large codebases has been observed to be minimal. Yet there are specific scenarios where the costs are real and observable. For those rare cases, an explicit opt-in to the risky behavior makes sense.

shultays 1 day ago

  The general cost over a several large codebases has been observed to be minimal
Is this unexpected? A large code base contains a lot of other things, and it is normal for such a change to be a rounding error; there are plenty of other bottlenecks that simply overwhelm it. I don't think "it doesn't affect large code bases much" is a good argument; you could say that about pretty much anything that adds overhead.

Not to mention that if you changed every `int a;` to `int a = 0;` in those code bases right now, the `a = 0` part would likely be optimized away, since that value is not (or shouldn't be) used at all and will be overwritten on all code paths anyway.

monkeyelite 1 day ago

We all agree that poor defaults were chosen in C++ across the board. We have learned a lot about languages since then.

The question is what to do about it - balancing the cost of change to code and to engineers who learned it.

> but there appears to be no interest in the committee to do anything with safety other than talk about it.

There is plenty of interest in improving C++ safety. It’s a regular topic of discussion.

Part of that discussion is how it will help actual code bases that exist.

Should the committee do some breaking changes to make HN commenters happier, who don’t even use the language?

112233 1 day ago

There is no hope for committee. In C++33 we will probably have variables defined as

    int const<const> auto(decltype(int)) x requires(static) = {{{}}};
And when asked how on earth this happened and why, the answer will be the same: "we must think about the existing code, the defaults were very poor".

Meanwhile, they absolutely could provide sane defaults when you plonk "#pragma 2033" in the source (or something similar; see e.g. Baxter's Circle compiler), but where would be the fun in that?

They still use single-pass compilation (and order of definitions) as the main guiding principle...

intelVISA 22 hours ago

The root issue is that the committee has no incentive to improve the language when the current situation enriches its key members; C++ is just the vehicle they co-opted to sell books, or consulting, on solving problems that they perpetuate.

monkeyelite 1 day ago

I’m with you - the features they add are baffling.

> we must think about the existing code, the defaults were very poor"

What does adding bad features have to do with maintaining defaults?

112233 1 day ago

A lot of the totally deranged contortions in C++, if you are lucky enough to find someone sharing the "secret" committee discussions, turn out to be caused by the interaction of ODR, ADL, decay, and other equally disease-sounding features. I'd like to know how many people who can use C++ comfortably could write the implementation of std::move and std::forward without looking. Or even tell which is which. Actually, let's try:

    template<typename T> T&& function_1(remove_reference_t<T> &x) {
        return static_cast<T &&>(x);
    }
    template<typename T> remove_reference_t<T> && function_2(T &&x) {
        return static_cast<remove_reference_t<T> &&>(x);
    }
Which one is move and which is forward?

Now, since all this Jenga must remain standing, the only options they have for fixing things that are complete misfeatures (like "this" being a pointer) are either requiring you to write make_this_code_not_stupid keywords everywhere ("explicit", "override", ...) or introducing magic rewriting rules (closures, range for, ...).

Some fixes go much lower: "decltype(auto)" is a pinnacle of language design that is unlikely to be surpassed.

int_19h 15 hours ago

Why do you believe that `override` belongs in that category?

112233 13 hours ago

Because the more desirable fix would have been requiring an explicit annotation for *new* virtual functions and making silent hiding an error. Instead, we have a situation where you have to write "override" everywhere to get the desired sane behaviour.

agent327 1 day ago

I was not proposing sweeping changes to all the defaults in C++, I was proposing to adopt a single, specific change. That change does not break any existing code, removes pitfalls from the language, and has already been tried by industry and found to be beneficial. Why is it not in C++26?

https://open-std.org/jtc1/sc22/wg21/docs/papers/2023/p2754r0... provides what appears to be the answer to this question: "No tools will be able to detect existing logical errors since they will become indistinguishable from intentional zero initialization. The declarations int i; and int i = 0; would have precisely the same meaning." ...yes, they would. _That's the point_. The paper has it exactly the wrong way around: currently tools cannot distinguish between logical error and intentional deferred initialization, but having explicit syntax for the latter would make the intention clear 100% of the time. Leaving a landmine in the language just because it gives you more warnings is madness. The warning wouldn't be needed to begin with, if there were no landmine.

I'm not sure what you mean with "who don't even use the language". Are you implying that only people that program professionally in C++ have any stake in reliable software?

monkeyelite 1 day ago

> Are you implying that only people that program professionally in C++ have any stake in reliable software?

No. I’m saying people who don’t understand C++ aren’t going to have good ideas about how to make initialization better.

And yes - comments which simply name-call can be safely discarded.

agent327 1 day ago

I'm confused. Who's name-calling? Are you trying to imply something about me (and if so, what gave you the idea that I don't understand C++)?

Please read this thread, in particular the comments by James20k (who, incidentally, is a C++ standards committee member): https://www.reddit.com/r/cpp/comments/yzhh73/p2723r0_zeroini...

This was three years ago. He enumerates numerous reasons for introducing zero-initialisation as I described, and yet somehow here we are, with C++26 just around the corner, and safety on everyone's lips. Is it part of C++26 now? Nope...

BlackFly 1 day ago

> Should the committee do some breaking changes to make HN commenters happier, who don’t even use the language?

As phrased, you clearly want the answer to that question to be no, but the irony is that this is exactly how you kill a language. It's survivorship bias, like inspecting the bullet damage only on the fighter planes that made it back. You should also listen to people who don't want to use your language to understand why, especially people who stopped using it; otherwise you risk becoming more and more irrelevant. Not all of it will be valuable evidence, but these are precisely the people who could not live with the problems. When other languages listen, better alternatives arise.

monkeyelite 1 day ago

> you clearly want the answer to this question to be no

Uh, yes. It's phrased that way because it's absurd. About half the comments in this section are a form of name-calling by people who don't understand constructors/destructors.

Those people have no insight into how to make initialization better.

> Otherwise you risk becoming more and more irrelevant

Relevancy is relative to an audience. You want to listen to people who care and have your interests in mind.

C++ and Java are the most relevant languages in terms of professional software engineering.

BlackFly 1 day ago

I would say the answer is to require explicit init for both, unless you explicitly say some equivalent of "trust me, bro" to the compiler. Some structs/data (especially RAII structs backing real resources) have no sensible default or zero value.

But yeah, most structs have a good zero value, so a shorthand to create it can be more ergonomic than forced explicitness.

agent327 1 day ago

That would be a breaking change, though. Default zero-init would apply to existing source and convey its benefits simply by recompiling, without requiring any engineering hours.