‘Before [const generics], Rust had no native ability to deal with arrays bigger than 32 elements’.
Is this a correct statement? I have seen posts talking about const generics being a new thing as of 2022. Did Rust actually lack the ability to have an array with more than 32 elements? I find it hard to believe that there was no way to have a longer array while Rust was still a production-level language.
You have always been allowed to have arrays longer than 32 elements, but dealing with them used to be hard. Beyond the Copy trait, which is a compiler builtin, many traits weren't implemented for arrays with more than 32 elements.
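A minimal illustration of the gap (the lengths here are arbitrary): traits whose array impls moved to const generics in Rust 1.47 now work for any length, while `Default` still shows the old cap.

```rust
fn main() {
    // Since Rust 1.47, Debug/Clone/PartialEq/etc. are implemented for
    // arrays of any length via const generics:
    let big = [0u8; 100];
    println!("{:?}", &big[..3]); // works fine at length 100

    // Default, however, is still only implemented up to length 32:
    let ok: [u8; 32] = Default::default();
    assert_eq!(ok, [0u8; 32]);
    // let nope: [u8; 33] = Default::default(); // error[E0277]: trait bound not satisfied
}
```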
And the feature is still limited. For example, long-standing users like serde still can't switch to the new const-generics-based approach, because of the same issue the Default trait is facing: both could use const generics if they were allowed to break their APIs, but neither wants to, so they are waiting for language improvements that would let them switch without a hard API break.
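The blocking issue can be seen directly: `[T; 0]` is `Default` for any `T`, so a blanket const-generic impl requiring `T: Default` would be a breaking change. A small sketch:

```rust
// A type that deliberately does not implement Default:
struct NoDefault;

fn main() {
    // The standard library has `impl<T> Default for [T; 0]`, with no
    // `T: Default` bound, so this compiles:
    let empty: [NoDefault; 0] = Default::default();
    assert_eq!(empty.len(), 0);

    // A blanket `impl<T: Default, const N: usize> Default for [T; N]`
    // would add a `T: Default` requirement even when N == 0, breaking
    // code like the line above. That's why Default (and serde's array
    // impls) are still stuck at the macro-generated limit of 32.
}
```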
Since nobody else mentioned it, it's worth pointing out that what e.g. JS calls an array is Vec in Rust and can be as long as you want, with no ergonomic difference regardless of the length.
Array in Rust specifically refers to an array whose length is known at compile time, i.e. a bunch of values concatenated on the stack, and that's what the limitations applied to.
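In other words (lengths arbitrary):

```rust
fn main() {
    // Rust array: length fixed at compile time, values stored inline
    // (here, on the stack):
    let arr: [u8; 1000] = [0; 1000];

    // Rust Vec: heap-allocated and growable, like a JS array:
    let mut vec: Vec<u8> = vec![0; 1000];
    vec.push(1); // can grow at runtime; an array cannot

    assert_eq!(arr.len(), 1000);
    assert_eq!(vec.len(), 1001);
}
```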
The quoted statement pissed me off a bit (I otherwise enjoyed the article) because it seems intended to mislead. The author should have known the colloquial meaning of "array", and "no ability to deal with" is factually incorrect.
If you set the array size to 32 or fewer, it works out of the box. You can get around the limit by using a macro instead of `Default`, or by implementing `Default` yourself, but it's still a limitation: you can't use `Default` with an array of more than 32 elements.
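To be precise about where the wall is: declaring and initializing large arrays always worked; only trait-based paths like `Default::default()` failed for N > 32.

```rust
fn main() {
    // This compiled on every stable Rust, const generics or not:
    let big = [0u8; 1000];
    assert_eq!(big.len(), 1000);

    // This is the part that failed for N > 32 before const generics
    // (and, for Default specifically, still does):
    // let big2: [u8; 1000] = Default::default(); // error[E0277]
}
```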
Yes, I know, but the trait limitation only applies to arrays, not to Vec. Many people coming from other languages would reach for Vec first when they want an "array". I believe that misunderstanding the meaning of "array" is why GP was surprised that Rust couldn't (ergonomically) handle more than 32 elements in an "array".
For the first year we did not have Vec, because we were no-std + stable, so we literally had to use arrays and could not reach for heap-allocated Vecs.
Things got much better after we got std and could use Vec, as you note, but there are still a few locations where we have no choice but to use arrays (i.e. some crypto APIs that are too risky to redesign, the bootloader, and the microkernel itself, which is still no-std, come to mind immediately).
> we did not have Vec because we were no-std + stable so we literally had to use arrays
It's true that Vec isn't available in a no-std context, but I don't think it follows that arrays are the only other option - see heapless for one example: https://github.com/japaric/heapless
I also agree with some of the ancestors: the post seems to say that the Rust language couldn't handle arrays with more than 32 elements, and (as someone who's written a fair bit of no-std Rust, before const generics) that doesn't seem right. At first it did seem awkward to me as well that some trait implementations weren't provided for >32 element arrays, but in practice I haven't found it to be a significant limitation.
Was there a particular scenario where it wasn't feasible to wrap a >32 element array in your own type and implement Default on it?
We tried heapless, but ran into some problems with it. I forget the issue, but it had something to do with a bunch of unsafe code in heapless and the way we did stack frame alignment causing subtle bugs with tuples that had u8s in them. That problem may be resolved now that we're a couple years on.
If I'm not mistaken you can't implement traits on types that aren't in your crate, so there's that limitation. But generally it's just another layer of friction that feels like it shouldn't be there, and it manifests itself as a form of technical debt. For example, inside the microkernel itself there is an array that tracks the connection IDs in and out of a process. It's limited to 32 elements. Back when it was created, that seemed like a lot. Now it'd be nice to bump it up just a little bit, but that would open up a can of worms, since the array never had all the traits implemented around it. So it's just sitting there until it becomes really worth the effort to raise that limit. There are a few spots like this.
Ah interesting. I've never dug in to the heapless implementation, but can imagine that getting it working right on a new platform might require a few changes.
It sounds like the "orphan rules" are what you're referring to; my understanding is that `impl SomeTrait for SomeStruct` needs to be in the same crate as either `SomeTrait` or `SomeStruct`.
I bumped into a similar situation with a project that involved a special buffer for handling digital audio. Initially I made that buffer generic and put it in its own module; the thing that used the buffer was in another module, and everything was brought together in the main application. I wanted the ability to adjust parameters of the buffer from the project's top-level configuration. I can't remember the exact details, but basically, with that structure, main couldn't be the place that configured both the buffer and the module that dealt with the digital audio peripheral. The solution was pretty simple though: realise that the data structure is only ever used in conjunction with the peripheral, so instead of main including both the buffer and the peripheral (a dependency graph with edges main-buffer and main-peripheral), put the buffer in the module with the peripheral (edges main-(peripheral&buffer), or main-peripheral and peripheral-buffer, I can't remember which).
If you revisit that problem, it might be worth declaring a type for the connection IDs, with whatever impls are needed, in the module that wants the >32 element array.
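A sketch of that suggestion, with hypothetical names: a newtype defined in the kernel module can carry whatever impls are needed, since the orphan rules allow implementing a foreign trait on a local type.

```rust
// Hypothetical newtype for the kernel's connection-ID table,
// bumped past the old 32-element limit:
#[derive(Clone, Copy, PartialEq, Debug)] // these derives work for any N today
pub struct ConnectionIds(pub [u32; 64]);

// Default is a foreign trait, but ConnectionIds is a local type,
// so the orphan rules permit this impl:
impl Default for ConnectionIds {
    fn default() -> Self {
        ConnectionIds([0; 64])
    }
}

fn main() {
    let ids = ConnectionIds::default();
    assert_eq!(ids.0.len(), 64);
}
```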
Perhaps. But in writing an OS, sometimes you genuinely do want the guarantees of an array. You especially would want to avoid the overhead that might come when the Vec gets resized.
Yes, and if you don't need dynamic size you can use an array (of any size). The lack of trait implementations is generally a minor inconvenience in the scale of the various inconveniences of writing an OS. It doesn't stop you doing anything.
You could always have bigger arrays; what was missing was the trait implementations. Originally the traits were implemented using a macro, and the macro only generated implementations for lengths up to 32.
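A simplified sketch of that macro pattern (not the actual std macro, and `Zeroed` is a made-up trait), alongside the const-generics impl that replaced it:

```rust
trait Zeroed {
    fn zeroed() -> Self;
}

impl Zeroed for u8 {
    fn zeroed() -> Self { 0 }
}

// Old style: stamp out one impl per supported length. Any length not
// listed simply has no impl, hence the 32-element cliff.
macro_rules! impl_zeroed_for_arrays {
    ($($n:expr),*) => {
        $(
            impl<T: Zeroed + Copy> Zeroed for [T; $n] {
                fn zeroed() -> Self { [T::zeroed(); $n] }
            }
        )*
    };
}
impl_zeroed_for_arrays!(1, 2, 16, 32);

// New style: with const generics, a single impl covers every length:
// impl<T: Zeroed + Copy, const N: usize> Zeroed for [T; N] {
//     fn zeroed() -> Self { [T::zeroed(); N] }
// }

fn main() {
    assert_eq!(<[u8; 32]>::zeroed(), [0u8; 32]);
    // <[u8; 33]>::zeroed(); // error: the trait `Zeroed` is not implemented
}
```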
Before const generics most traits were only implemented up to 32 elements though, which could be quite annoying. Even more so as the compilation error was not exactly informative.
There were some awful hacks to make integer parameters to generics sort of work before const generics went in. There were tables of named values for 0..32, then useful numbers such as 64, 128, 256, etc. Those haven't all been cleaned out yet.
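Those hacks looked roughly like type-level integers (the pattern the typenum and generic-array crates built on); a minimal self-contained sketch:

```rust
use std::marker::PhantomData;

// One marker type per number you need; crates shipped tables of these
// for 0..=32 plus round numbers like 64, 128, 256.
trait Unsigned {
    const VALUE: usize;
}

struct U32;
struct U64;

impl Unsigned for U32 { const VALUE: usize = 32; }
impl Unsigned for U64 { const VALUE: usize = 64; }

// A type "parameterized by an integer" via the marker type:
struct FixedBuf<N: Unsigned> {
    _marker: PhantomData<N>,
}

impl<N: Unsigned> FixedBuf<N> {
    fn capacity() -> usize {
        N::VALUE
    }
}

fn main() {
    assert_eq!(FixedBuf::<U64>::capacity(), 64);
}
```

With const generics this whole machinery collapses into a plain `struct FixedBuf<const N: usize>`.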