Aside from the almost completed features in the pipeline, I only see procedural macros (macros 2.0) and whatever happens to make async code easier to write really impacting day-to-day code for most people.
 I'd really like to see F# Computation Expressions instead of async/await. I know the language experts have said Haskell-like do notation doesn't work in Rust but I'm not sure if the F# tweaks would make it work or not.
I'd prefer a systems-y, zero-cost take on algebraic effects, similar to what OCaml is going to get. Could be much more extensible, and open things up to annotating whether functions panic or not, access global state, etc. Alas it's still a tricky research problem, even after all these years. There were some nice discussions from ICFP here - the comments strayed into talking about how effects might be implemented without a GC: https://www.youtube.com/watch?v=DNp3ifNpgPM
For me clippy evokes the image of a well-meaning, if at times clumsy helper that tries to nudge you in the right direction. I find that very fitting for a tool that...well...tries to nudge you in the right direction of writing good, fast, idiomatic code.
Had to look this up - very cute!
OCaml almost fits the bill (Rust is inspired by OCaml after all), but the tooling around it is lacking to put it mildly.
It still doesn't really match the idea you probably have in your head. When people can choose between different types of pointers, they'll choose Rust's normal lightweight lifetime-checked references 99% of the time. Gc<T> will probably only be used in those rare cases where an object has no clear owner. They figured this out in the early days of Rust.
 - ziglang.org
As a Python dev of 10 years, I can't point fingers -- Python is probably worse, though I've memorized its idiosyncrasies -- but I couldn't justify climbing the tooling learning curve on top of the language learning curve.
Do you remember the library? The landscape has changed dramatically in the past ~6 months with .NET Core 2.0 support. For example, I can use Fable and Giraffe with the .NET CLI to build full-stack F# apps on my machine, which runs .NET Core. The big remaining blocker for most people to jump wholesale onto .NET Core and forget anything Windows-based is the lack of Type Provider support, but we're quite close to finishing that.
Is there a viable cross platform ui option for f# on core?
Unfortunately, the fsharp.org site is kind of out of date. I think that's mostly a function (heh) of the MS F# docs becoming much better (can be found here). These instructions for getting a dev environment set up are pretty good. If you hunt around, you can also find blog posts which may be a little more comprehensive.
Since it's useless, not having it is great: you no longer have to worry about it and the problems it causes, like random pauses, sawtooth-shaped and excessive memory usage, and the inability to use swap properly (though you do have to worry about heap fragmentation, which is usually not as terrible).
Also, you'd need lifetimes and borrowed references anyway to get static guarantees such as "there are no remaining references to mutex-protected data after you unlock the mutex", so having a GC as well actually increases complexity.
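A minimal sketch of what that static guarantee looks like in practice: data behind a `Mutex` is only reachable through the lock guard, so a reference derived from it can't outlive the unlock.

```rust
use std::sync::Mutex;

fn main() {
    let counter = Mutex::new(0_i32);

    {
        // Locking returns a guard; the data is only reachable through it.
        let mut guard = counter.lock().unwrap();
        *guard += 1;
        // `guard` drops here, unlocking the mutex. The borrow checker
        // rejects any attempt to keep a reference into the data past
        // this point -- no GC needed for that guarantee.
    }

    println!("{}", *counter.lock().unwrap()); // prints 1
}
```

The unlock is tied to the guard's scope (RAII), which is exactly the lifetime machinery the comment is referring to.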
I think there's a lot to that. I'd be really interested to try it, though, to see how it pans out.
What I’d like to see is for one language to be possible to use “vertically” through a stack. That is - for example a C# systems version with manual memory that you can use instead of C interop for the small core of your app. Or, the opposite, a dumbed down version of Rust that works well for classic reference heavy UIs and similar.
I get what you're saying though. Another poster mentioned Swift and indeed Graydon Hoare, Rust's creator, is now working on Swift at Apple. And I believe some kind of notion of borrow checking/lifetimes is supposed to be coming to Swift in the future?
I know the Reason guys want to tackle project setup / build / deps in addition to their syntax changes. I've found the current release's bsb toolchain to work pretty well for js targets but I haven't tried to set it up for native compilation.
I'm concerned about the ReasonML -> OCaml -> BuckleScript -> JS compilation chain. The law of leaky abstractions pretty much guarantees this is not a robust way to do things.
I also wished the ReasonML folks started from scratch, instead of inheriting OCaml's baggage (no forward references, a plethora of file types to deal with, no UTF8 strings without bringing in an external lib, and so on).
You don't need the BuckleScript toolchain to write OCaml; it has a very good package manager, the build-system tooling is getting better, and it has had great editor support for many years now (Merlin).
I think the tooling scene is much better than Haskell. Not sure what you mean by "no forward references"?
I'm guessing they are talking about having implicit mutual recursion between items in a module, like Haskell has.
I probably got the name wrong, but it's the ability to use a function before it is defined.
In OCaml/ReasonML, you'd have to use the rec keyword and structure your codebase in a particular way to define mutually-recursive functions. It is a small but noticeable papercut, especially since recursion is so common in a functional language.
F# code (with a catch) --> F# AST --> Babel AST --> JS
The big thing here is that the F# code you write has slightly different semantics than "normal" F#. That's because the runtime environment is different, and you can't escape that. Rather than attempt to gloss this over, the Fable creators are pretty explicit about it, including documenting each of the (small) differences. The result is pretty good. Abstractions don't seem too leaky from my vantage point.
AFAIK there is some work on a GC in Rust, but it doesn't take away the borrow checker like you wished ;) So you will still need to use `.borrow()` and `.borrow_mut()`.
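For readers who haven't hit this yet, a small sketch of what `.borrow()` and `.borrow_mut()` look like with today's `Rc<RefCell<T>>`, where the aliasing check moves from compile time to runtime:

```rust
use std::cell::RefCell;
use std::rc::Rc;

fn main() {
    // Rc gives shared ownership; RefCell defers the
    // exclusive-access check to runtime.
    let shared = Rc::new(RefCell::new(vec![1, 2]));
    let alias = Rc::clone(&shared);

    alias.borrow_mut().push(3);        // runtime-checked mutable borrow
    println!("{:?}", shared.borrow()); // prints [1, 2, 3]
}
```

Overlapping a `borrow_mut()` with a live `borrow()` panics at runtime instead of failing to compile, which is the trade-off a `Gc<T>` would presumably share.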
A language similar to Rust but with GC (and no borrow checker) could be quite useful for a lot of applications out there. Basically Go but a more "modern" language.
(ARC, not tracing GC, but still.)
ARC is automatic reference counting; Arc is atomic reference counting.
In Rust you still have to manually .clone() to addref an Arc. In swift you don't. This is a major difference ergonomics-wise.
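A sketch of that difference: in Rust the refcount bump is a visible `Arc::clone` call, not an implicit retain inserted by the compiler as in Swift.

```rust
use std::sync::Arc;
use std::thread;

fn main() {
    let data = Arc::new(vec![1, 2, 3]);

    // The addref is explicit: the thread gets its own handle
    // via Arc::clone, and `data` remains usable afterwards.
    let handle = {
        let data = Arc::clone(&data);
        thread::spawn(move || data.iter().sum::<i32>())
    };

    let sum: i32 = handle.join().unwrap();
    println!("{}", sum); // prints 6
}
```

Whether the explicitness is a cost or a benefit is the ergonomics debate in question; it does make every refcount operation visible in the source.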
I wonder how threading would work without the borrow checker, though. It would be cool if you could still have the "fearless concurrency" part.
But if you are willing to accept runtime GC, a JIT runtime like JVM/Node might also not be out of the question.
So Scala Native would be spot on, if it were ready for production use.
Not-crazy suggestion: add a `--why` option to rustc that doesn't just shout (helpful) error messages at you but actually explains the architectural theory of why the error is occurring and provides documentation links that help outline the canonical "right way" to do whatever it is you're trying to do.
The idea being that rustc applies a ton of heuristics and intelligence (perhaps by analyzing nearby code) to guess what your high-level goal is.
There have been a couple of "programming helper" AI-type apps float past here recentlyish; that's what I'm getting at.
The reasons I think this would be a good idea is that
a) this would be really really hard to get right, but AI R&D is pretty much at the point where you could put something like this together and build it into the compiler and pull it off really well
b) rust seems to have a ton of energy behind it so if any language was going to implement this, it looks like rust has a fighting chance to actually get it done
c) rust hasn't stabilized yet so _now_ is exactly the time to fold something like this in. it would need to be integrated as early as possible as a bolt-on afterthought would never work the same way
Perhaps some console colouring could help as well. I'd really like some type diffs on type mismatch errors (I think Dotty has this?). That could be super handy for quickly diagnosing errors - speaking as a reasonably experienced Rust user here.
Applies to other languages as well.
This is how I felt, but it wasn't bad: I think you start to pick up on the patterns. In other words, learning isn't short-circuited after you've made rustc happy, even though you're still a bit puzzled - you'll carry the experience along with you and continue to learn from it later.
I must have written around a thousand lines before needing a lifetime annotation, vs. something like 30 lines two years ago.
There's svd2rust and dslite2svd, which are basically "bindgen for hardware". Please correct me if I'm wrong, but from what I understood, most SVD files leave out a lot of information, so a lot of what svd2rust generates is unsafe. It seems like the only way around that is to patch the SVD files. Is this still the case?
In addition, what about compiling with no stdlib? And how does that interop with existing crates (i.e. do new "embedded-specific" crates need to be created)?
Super excited to see Rust gain traction! I'll definitely take another shot in my free time and see how the ecosystem has matured.
Nobody in their right mind would write C or C++ micro-services, but post-Spectre and Meltdown, any reclaimed performance is tangibly valuable. Rust could be the one to swoop in and claim that position.
Also, it is not yet fit for writing GUI code. It is quite far from what is possible to achieve today in Qt/WPF/Cocoa/Android/... tooling and even the latest NLL improvements don't fix all issues regarding writing callbacks.
So, even if we consider a web app as a GUI, Rust wasn't invented for writing that (you still use CSS, HTML, JS etc for that part). It was invented for writing the backend for that UI.
In other words, Rust is not GTK (a UI library), it's C (the language the UI library itself is written in).
There is nothing in Rust that makes it inherently unfit for APIs or the web. To dismiss the entire language is simply lazy. Having GC also doesn't necessarily make one language superior to another.
While the HTTP ecosystem is still in active development, if there were really a choice between Rust and JS for, e.g., writing an API, I would go with Rust in a heartbeat.
Having a GC means easier ergonomics when writing data structures and distributed algorithms.
I do like Rust's type system a lot; the ergonomics just aren't quite there yet.
You're correct, but "Rust isn't easy to use for purpose X just yet" is substantially different from "Rust is inherently unfit for X"
Please explain to me where this idea of slow development time comes from. Is it only because people try Rust for two weeks, can't get the hang of lifetimes, and decide the development speed is slow?
But that's not the only constraint. If you want to save memory, due to the environment you run on, or your workload, a non-GC runtime can be pretty awesome.
And in my experience, it's way easier to reason about allocations in Rust than it is in Java (automatic type erasure if you want to use generic code) or Go (how will escape analysis behave in that case?), for example. Rust traits are way better than C++ interfaces in that regard as well.
Rather languages in the Oberon and Modula family, D, Nim, Eiffel, C# as of 7.2, Common Lisp, and a few MLs.
If you do RC everywhere, using library types, it is slower than a tracing GC, and still not as productive.
Inside the Rust community, there's been a lot of debate about what an idiomatic Rust API for a GUI toolkit would look like, particularly because the traditional tree-of-widget-objects design of C++ GUI toolkits appears a bit inconvenient to realize within Rust's stricter semantics.
IOW, "are we gui yet?" might depend on some sort of new written-in-Rust GUI toolkit to appear. There's 20 years of work by many hundreds of people in Qt, many of whom rallied not just around Qt specifically, but around projects using it. Getting that many people moving isn't easy. Alternatively, major progress has to be made on bindings and bridging to the semantics of other languages in Rust.
That said - of course C++ was around quite a while before being adopted by Qt.
There's no version of that concept that's sane at this time, though. Wrapping full-fidelity APIs the browser engine has access to in a lossy abstraction layer of shoddy web platform APIs isn't a practice Servo can improve upon. It takes improving those APIs.
There's a small number of macros that a custom preprocessor expands to (very pedestrian) generated code, but they are optional to use (though it's certainly uncommon not to use them) and these days there's template-based versions of some of them that are preferred and steadily finding adoption over the macro ones because they're superior. Beyond that there's a few more normal preprocessor-based macros that are entirely optional.
In that sense Qt also tracks vanilla C++ very closely and has adopted features from newer language versions at a fairly steady pace and swiftly for such a large production library set.
None of this matters much to my point though - that was about much more basic language semantics like classes-based OOP and the absence of ownership rules.
I need to fulfil task X, with tool Y, then I choose language Z among those supported by Y.
Not I chose language Z, then try to find some kind of Y, that helps me solve X somehow.
When C++ came into the picture we got Turbo Vision on MS-DOS bundled with Borland compilers, Apple adopted Metrowerks tools and used PowerPlant, OS/2 promoted CSet++, UNIX guys were into Motif.h++.
Then came Windows; Borland pushed forward with OWL, followed by VCL. Microsoft created MFC, adopted by the Symantec and Zortech compilers.
Apple eventually went Cocoa, and Microsoft UWP as of latest.
And then there is Qt, literally the only game in town for C++ GUIs not related to any OS vendor.
Followed by what is possible in the Java and .NET eco-systems.
All with good visual tooling support, allowing for productive workflows between developers and designers.
I am not saying that Rust some day won't catch up, and offer something similar in terms of productivity.
I hope it does, but right now in 2018 it doesn't.
Since cloud providers do all the plumbing, Rust would only have to deliver a tiny Linux binary for the function.
why? I do this all the time. Maybe I should check into a nearby asylum.
...and that is flat out the case for c++, unequivocally.
In rust you’d expect it to be better, because of (reasons rust is good here, like safety, having a package manager and an ecosystem, etc etc), but practically, there are too many halfbaked solutions, too much ‘use nightly’ and few good stable proven solutions to look at.
Rust isn’t the right solution for every problem, but it should be a good solution for secure very high performance network services.
It's definitely a good goal for the year~
Arguably, this is why Google made go.
But I'm not sure c++ micro services are a terrible idea. I think the real problems come with a) feature creep as opposed to writing new micro services, and b) the temptation to write "fast" c++.
Just writing plain obvious c++ should give you a leg up on most languages (esp for micro services). But then, rather than refactor and think about the algorithm and data structures (and get a 10x performance benefit from a simpler solution), you could probably get a 2x improvement from a more convoluted, "special" c++ implementation.
And that's where you're likely to encounter elder horrors lurking.
[ed: I'm however more and more convinced there's no such thing as "plain obvious C" that doesn't have a number of serious issues, along the lines of not checking malloc return values, UTF-8 string handling, etc.]
It's quite common in the embedded space to use C or C++. It's easy to run everything as root and skip fundamental bound checking, so those languages get a bad rep in that space. Totally understandable. But it is possible to engineer solutions that are a lot safer too. It's just a question of priorities. Companies not prioritizing writing safe code or not working on having a safe architectural design will not bother with Rust or other memory safe languages. There's little to no market incentive in most areas. Sad but true.
A common theme in the C++ community is to advocate C++ to devs that only see Assembly and C89, and don't see any reason to move beyond that.
Most recent example "Embedded & C++ - Meeting 2017 Keynote"
But as far as using it as a full featured web framework, a la Rails, Django or even Laravel, I can see why people say it's got a long way to go.
That said, I haven't built anything production or large scale in Rust, so I can't claim to be an expert.
Using a safer language makes the job easier and the code less prone to out-of-bounds memory accesses.
By using C/C++ you're choosing to make a hard problem harder.
Oh and btw, "out of bounds memory accesses" are not the primary source of security vulnerabilities at all. In fact, such out of bounds access is more likely to just make your program crash (which is a good thing).
 shameless plug: https://github.com/duneroadrunner/SaferCPlusPlus
Software written in higher-level languages is often less widely deployed, so fewer people are trying to find vulnerabilities. All the complexity adds up, and can result in new vulnerabilities that wouldn't have been there if the code were written in a simpler language.
If you do, make sure it says "believes in micro-services" on the referral ;-)
Everything else is Spring Boot (Java), Akka (Scala) or Node.js. Apparently nobody wants to embrace a language designed for offshoring, so no Meetups, no noisy user groups and so on... even Clojure is more popular.
"The key point here is our programmers are Googlers, they’re not researchers. They’re typically, fairly young, fresh out of school, probably learned Java, maybe learned C or C++, probably learned Python. They’re not capable of understanding a brilliant language but we want to use them to build good software. So, the language that we give them has to be easy for them to understand and easy to adopt. – Rob Pike 1"
"It must be familiar, roughly C-like. Programmers working at Google are early in their careers and are most familiar with procedural languages, particularly from the C family. The need to get programmers productive quickly in a new language means that the language cannot be too radical. – Rob Pike 2"
So a language that makes it quite easy for enterprises to deal with developers as cogs.
Plus the performance benefits will be nice.
It doesn't mean that your services are written in C/C++.
True, there is the Rust Programming Language v2 book and several other learning resources online, but they all carry redundant information and leave it to the user to discover what they want to learn. A legendary wiki that acts as a cookbook of all things Rust would be awesome: anyone could pick what they want to learn about and go at it.
I would happily contribute and do it, but I'm just starting to learn Rust. Perhaps it is easier for other experienced Rustaceans to do it right.
I thought it was a small effort to show off the language features. Like a live (runnable code) version of the Rust Programming Language book. Very useful for someone trying to learn the language feature by feature.
What I'm writing about is a how-to style wiki. For example,
How to do things with Rust:
How to read from stdin?
How to read and write from a text file?
How to make a REST API call from Rust?
How to parse JSON content in Rust?
How to connect to PostgreSQL in Rust?
How to issue Unix commands to a remote host via SSH using Rust?
How to create threads for parallel work in Rust?
You get the idea..
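For flavor, here's what one of those cookbook entries might look like -- the "threads for parallel work" one, as a minimal sketch using only the standard library:

```rust
use std::thread;

fn main() {
    // Spawn a few worker threads, then join them in order
    // and collect their results.
    let handles: Vec<_> = (0..4)
        .map(|i| thread::spawn(move || i * i))
        .collect();

    let results: Vec<i32> = handles
        .into_iter()
        .map(|h| h.join().unwrap())
        .collect();

    println!("{:?}", results); // prints [0, 1, 4, 9]
}
```

A real wiki entry would presumably also cover scoped threads and channels, but even bite-sized snippets like this are what makes a cookbook format so approachable.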
If I learn something new, I'll try to add to their repo.
For me, while I'd really like to try Rust, these are a deal-breaker.
Does anyone know a ballpark figure for how many LoC/s you can expect on a beefy workstation?
Furthermore, this is the start of incremental: we have more stuff coming down the pipeline.
For example, local variables of a function get destroyed on function exit no matter what, and you can't prevent that by adding a lifetime annotation to a reference to a local variable that you're trying to return.
That was a stumbling block for me.
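A minimal sketch of that stumbling block, with the usual fix (return an owned value rather than a reference):

```rust
// This version does not compile: `s` is destroyed when the
// function returns, and no lifetime annotation can change that.
//
//     fn broken() -> &String {
//         let s = String::from("local");
//         &s // error: cannot return reference to local variable
//     }

// The fix is to hand ownership to the caller instead:
fn fixed() -> String {
    String::from("local")
}

fn main() {
    println!("{}", fixed()); // prints local
}
```

Annotations describe relationships between lifetimes that already exist; they never extend a value's life, which is exactly why the broken version can't be annotated into working.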
Would you mind if I made a pull request to the Rust book adding your explanation to the lifetime chapter?
> Lifetime annotations don’t change how long any of the references involved live. In the same way that functions can accept any type when the signature specifies a generic type parameter, functions can accept references with any lifetime when the signature specifies a generic lifetime parameter. What lifetime annotations do is relate the lifetimes of multiple references to each other.
"Descriptive and not prescriptive" part probably comes from stackoverflow. I don't remember exactly.
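The quoted passage is describing the book's `longest` example; a minimal version shows an annotation relating the lifetimes of several references without extending any of them:

```rust
// The generic lifetime 'a ties the output to the inputs: the
// returned reference is valid only as long as both arguments are.
fn longest<'a>(x: &'a str, y: &'a str) -> &'a str {
    if x.len() > y.len() { x } else { y }
}

fn main() {
    let a = String::from("longer string");
    let b = String::from("short");
    println!("{}", longest(&a, &b)); // prints longer string
}
```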
Lastly, I would first advise you to work through https://learnxinyminutes.com/docs/rust/ to get a feel of the language (like dipping your toes into the water). If you like what you see, then continue from the Rust book: https://doc.rust-lang.org/book/second-edition/ .
And most importantly remember to have fun!
Mozilla's IRC server has a #rust-beginners channel that is like, overwhelmingly helpful. Go and ask stupid questions.
"The Rust Programming Language" is good:
Knowing the difference between the stack and the heap, and having a picture of it in your head and understanding which variables go where, and what happens to the stack when calling and returning from functions, is pretty important. (Python wouldn't give you this intuition, although programming in something like Java might.)
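A tiny Rust sketch of that mental picture: a plain local lives on the stack and disappears when the function returns, while `Box` puts the value on the heap behind a stack-allocated pointer:

```rust
fn main() {
    // `x` lives on the stack; it's copied on assignment and
    // destroyed when the function returns.
    let x: i64 = 42;

    // `boxed` is a stack-allocated pointer to a heap allocation;
    // the heap memory is freed when `boxed` goes out of scope.
    let boxed: Box<i64> = Box::new(42);

    println!("{} {}", x, *boxed); // prints 42 42
}
```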
I half-assed learned to code in C++ for scientific purposes. Later I got deeper into Rust, and it's made me a better, safer C++ programmer when I run into C++ code.
Really looking forward to learning from your material, when it's ready.
Anyway, the other comments about reading the docs and building things are correct. I'd just point out two things in addition.
1) Don't feel like you have to go write some impressive low level application just because the language is lower level - fizzbuzz works fine too.
2) Don't worry too much about whether you're strong enough on systems concepts. It's not a class you fail if you show up unprepared for; it's (presumably) play time you're engaging in to scratch your own itch. You'll make mistakes but that's not something to be avoided, it's the only way to learn.
You might give Nim a try.
Coming from Python, you might find Nim's syntax (and significant whitespace) familiar. It took me (also a Python programmer) a week or so before I was able to write small programs in Nim.
(Don't let "not yet v1.0" turn you off.)
One thing that put me off in the end was that it didn't support native OS threads (I have the same problem with Racket). Has threading support improved over the years?
The main push in this area is Multicore OCaml. I don't track OCaml closely enough to know how getting this into the main compiler is going.
However, because I am used to writing multithreaded code using pthreads or std::thread, I find the standard OCaml model uncomfortable and limiting. It's mostly a matter of personal taste, but this is the reason I stopped using OCaml.
Nope, different structs could have fields named the same.
For instance, there is an instance of the Racket VM per place, which makes them heavier than they would be in a standard systems programming language such as C, C++ or Rust.
I am not saying it is better or worse -- just different.
Regarding books, Real World OCaml is very well written, although some examples might need tweaking.
Mostly wondering how people ship stuff if there are different variants of std libs. Say you write your code entirely using one std lib and then try to share it (the code, that is) with somebody who uses a different std lib -- then the onus of learning the new std lib is entirely on them, right? How does that work for teams? I know that Jane Street is a big user, so they have considerable influence.

That being said, does the std lib (the one bundled with the distro) have enough features (all the necessary data structures, for instance) to make using it viable in a non-trivial project (off the top of my head, say writing a very simple web server)? If so, I suppose that's good enough.
Any recommendations for learning material? I have a copy of "Practical OCaml", don't really like it. INRIA's materials seem quite substantial, but more like reference.
The migration from the standard stdlib to the alternate ones is fairly trivial (some signatures might differ, but with a typed language the compiler will catch them for you), so you can start with the stdlib and, if you find yourself rewriting trivial things that should be in it or need some advanced data structures, switch to the alternates.
Also, you need the `lwt` (most people prefer this) or `async` library for writing concurrent programs like a web server. `Cohttp` is a good library for HTTP-based clients/servers. I strongly recommend Real World OCaml, and the reference manual is excellent for looking up new type-level features. Reading the code of well-known libraries is also an excellent way to learn a language.
Browse Python projects on github and find some that strike your fancy. Especially those which may run slow, like puzzle solvers, image manipulation apps, etc. Bonus points if they have few external dependencies. Rewrite them in Rust.
I think Rust is very good for C++ experts. Some key concepts, like ownership and RAII, are well known in C++, so they should be easy to understand.
Compiler-enforced safety and a built-in package manager are really nice, but it doesn't quite make up for the loss of a lot of handy C++ features.
It is new, it is shiny, but let's wait for its 10th birthday before it can be taken seriously, and please provide a decent async IO / HTTP library before we even get started on what else can be done in Rust.
I have found Rust library coverage to be really good for my requirements. Hyper (https://hyper.rs/guides/client/basic/) provides an async HTTP client although I have never needed it.
Personally, if I had an application that was heavily using HTTP I would choose a different language -- Java, Scala and Go would be good choices.
Http/2 client and server https://github.com/carllerche/h2
Web framework https://github.com/actix/actix-web
All of them have good quality and performance.
Using it at the moment and it's been great to use: documentation was good, API was straightforward, everything so far has worked practically first time.
Have you not looked at it for a while or don't you think tokio/hyper is good?
A work-in-progress rearchitecting of the tokio-core crate in line with tokio-rs/tokio-rfcs#3
I had to look up on Wikipedia when Rust was conceived: 2010. It took me another moment to realize that it's already been 8 years.
We should be flying in spaceships programmed with Rust by now. This is ridiculous.
Which is a shame, because the article is otherwise fine.
If someone hasn’t bothered to take the time to proof-read their work, what does it say about the topic they are writing about? I don’t mind if the mistakes are because they are learning English, but I’m talking about really obvious spelling mistakes.
(I checked - TIL an 'erk' is a male member of the RAF of the lowest rank.)
Or did I just miss the ironic joke?
`s/\!\+/./` would cut it, though.
(also, the author isn't a "guy")
From the luafun library readme (which uses iterators under the hood):
> Let's see an example:
> -- Functional style
> require "fun" ()
> -- calculate sum(x for x^2 in 1..n)
> n = 100
> print(reduce(operator.add, 0, map(function(x) return x^2 end, range(n))))
> -- Object-oriented style
> local fun = require "fun"
> -- calculate sum(x for x^2 in 1..n)
> print(fun.range(n):map(function(x) return x^2 end):reduce(operator.add, 0))
-- skip some initialization code --
0bcaffd0 movaps xmm5, xmm7
0bcaffd3 movaps xmm7, xmm1
0bcaffd6 addsd xmm7, xmm5
0bcaffda ucomisd xmm7, xmm0
0bcaffde jnb 0x0bca0024 ->5
0bcaffe4 movaps xmm5, xmm7
0bcaffe7 mulsd xmm5, xmm5
0bcaffeb addsd xmm6, xmm5
0bcaffef jmp 0x0bcaffd0 ->LOOP
---- TRACE 1 stop -> loop
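For comparison, the same sum-of-squares written with Rust's iterators, which compile down to a similarly tight loop (a sketch; the numeric types are my choice):

```rust
fn main() {
    let n: u64 = 100;
    // sum of x^2 for x in 1..=n, same as the luafun example
    let sum: u64 = (1..=n).map(|x| x * x).sum();
    println!("{}", sum); // prints 338350
}
```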
This one was particularly uninformed.
- "I'm writing a profiler"
- "I wrote a mini-OS"
- "I gave a talk at RustConf"
Other than that, it's really good to see Rust improving.