Prebuilt binaries. We now prebuild binaries for Windows, macOS, and Linux, which should reduce your CI build time significantly. For exotic OSes, installation will fall back to building from source.
bs.deriving light config. For people who prefer short accessor names, this is now configurable:
type t = {
  x : int
} [@@bs.deriving {abstract = light}]

let f (obj : t) = obj |. x
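For readers curious about the output, here is a rough sketch of what the accessor usage above compiles to (illustrative only, not verbatim bsc output): with [@@bs.deriving abstract], a value of type t is a plain JS object, and the light accessor x becomes direct field access.

```javascript
// Illustrative sketch, not verbatim compiler output:
// `t` is a plain JS object, and `obj |. x` compiles to direct field access.
function f(obj) {
  return obj.x;
}

console.log(f({ x: 42 })); // 42
```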
This is the last major release targeting OCaml 4.02.3; future major releases will target OCaml 4.06.
Happy Hacking!
January 7, 2019
bs-platform 4.0.17 is a major release.
It improved incremental compilation time significantly.
A picture is worth a thousand words: below is a large monorepo containing 4096 modules. Changing the root module, which has more than 3000 dependents, finishes building within 400ms.
We will write a dedicated article explaining how we achieve this incredible build performance.
Another quite important but non-user-facing change is that we renovated the internal build system; it will make contributing much easier later on.
We will update the contribution guide once it gets stable.
Today we released version 4.0.8 of bs-platform. A detailed list of changes is available here.
Most user-facing changes are bug fixes and small enhancements, while quite a lot of work has been done behind the scenes towards the more fundamental improvements coming down the line. This blog post refers to the BuckleScript runtime and some of the work we are doing to improve it.
The design goal of BuckleScript is to make the runtime as small as possible. This runtime is divided into two parts: the C shims, and the fundamental language feature support.
The C shims are not a strict runtime requirement: in the native backend, the functions are implemented in C, but in BuckleScript this isn't necessary. We can either implement the C shims in a polyfill style or just implement them in OCaml and compile them via BuckleScript. Recently, we have been shifting more and more work from the runtime to the normal OCaml stdlib by patching it with conditional compilation. The benefit is obvious – they are just normal functions which do not need special compiler support – but the downside is that we might need more patches to libraries which use C functions. Considering the more challenging task of maintaining patches to the compiler itself, we think such overhead is worthwhile.
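As a tiny illustration of the polyfill style, here is a hand-written shim for integer comparison (the name mirrors the runtime primitive naming convention, but the code is our sketch, not the actual shim):

```javascript
// Polyfill-style shim sketch: in the native backend this bottoms out in C,
// but for JS a few lines suffice (or the OCaml version can simply be
// compiled via BuckleScript instead).
function caml_int_compare(x, y) {
  return x < y ? -1 : x === y ? 0 : 1;
}

console.log(caml_int_compare(1, 2)); // -1
```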
If we ignore the C shims, the BuckleScript runtime is very small, and it is pretty easy for experienced BuckleScript programmers to write runtime-free code which generates standalone JS code. Such code could include supporting curried calling conventions, encoding of OCaml ADT, etc.
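As a rough sketch of such runtime-free output (encodings varied across releases, so this is illustrative rather than exact compiler output):

```javascript
// A fully applied curried OCaml function like `let add x y = x + y`
// compiles to a plain JS function, needing no runtime support...
function add(x, y) {
  return x + y;
}

// ...and a variant value such as the list cell `1 :: []` was encoded
// as a plain JS array block, with the empty list as the constant 0.
var cell = /* :: */[1, /* [] */0];

console.log(add(1, 2)); // 3
console.log(cell[0]);   // 1
```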
The BuckleScript runtime is written in BuckleScript itself. The benefit is that this is much more maintainable than implementing it in handwritten JS, and it is easier to keep invariants when crossing the boundary between the runtime and the stdlib. For example, we don't need to worry about the consistency of the runtime encoding of the tuple type, since the runtime is also implemented in BuckleScript, and we get three output module formats for free thanks to this "dogfooding".
However, this makes the build system pretty complicated and fragile, and the dependencies between each module are mostly hard coded. Even worse, this introduces a hard dependency between the normal libraries and the runtime binary artifacts.
In particular, one issue we want to address is to make the BuckleScript toolchain lightweight. We will continue to implement the BuckleScript runtime by using BuckleScript itself, but we want to get rid of dependencies like the support for exceptions. In the end, installation will no longer involve building the runtime: BuckleScript will simply be a bunch of generated JS files, so the complexity of the build system will not impact users at all. This is quite important given that we are committed to supporting Windows.
In the future, we will therefore be able to distribute the runtime as a normal JS library, and the BuckleScript user will only need the binary compiler and a small set of JS files. They will be able to use stdlib, Belt or anything else.
To get rid of such dependencies between stdlib and the runtime, we are going to introduce a breaking change in the future. In hindsight, our support for catching JS exceptions exposed the concrete representation of the exception encoding, in particular:
match ... with
| OCamlException exn -> ...
| Js.Exn.Error e -> ...
In this release, we introduced a function to avoid exposing such exception constructors:
match ... with
| OCamlException exn -> ...
| e ->
  match Js.Exn.asJsExn e with
  | Some jserror -> ...
  | None -> ...
We encourage you to make such changes yourself to future-proof your codebases.
Oh, and by the way, one side effect of this refactoring of the BuckleScript runtime is that compilation no longer requires reading the generated .cm* files, which means faster compilation :)
In this article we will explain what we are doing now and what we plan to improve in the next half year (Dec-May). We would also like to hear your feedback so that we can adjust accordingly.
Keep in mind that the development team is very small, so we have to prioritize instead of working on every feature.
What we are doing
In the last couple of months we have been busy upgrading the OCaml compiler from 4.02.3 to 4.06.1; the good news is that the upgrade is almost done.
We plan to ship it by the end of this year. At the same time, we will maintain the current version of the compiler until we feel the new compiler is as good as the old one.
Note the upgrade is not easy work, since the internals of the OCaml compiler have changed significantly in the last few years. Our upgrade strategy is also quite conservative: it works by conditional compilation so that the bsc compiler actually works with both versions. The benefit is that we avoid a messy state, and bug fixes to the bsc compiler can still benefit both branches.
But the reward is also huge: a bunch of optimizations and nice features have come along in recent releases of the OCaml language, to name a few: inline records, local exceptions, and hex notation for floats. More importantly, this is a great move toward engaging better with the OCaml ecosystem: the previous old compiler version imposed some maintenance overhead on the OCaml toolchain, and we can make better use of that toolchain after the upgrade.
What's next
Making better use of the new compiler internals
The upgrade is divided into two stages; the first stage is focused on avoiding regressions so that we can ship it.
Afterwards, we have plans to make better use of the richer IR in the new release. Thanks to flambda, introduced in OCaml 4.03, more information is passed down to the lambda IR, which is where BuckleScript hooks in. For example, users can annotate whether or not a function should be inlined via the [@inline] attribute, and we can make better use of such information.
In newer versions of the OCaml compiler, blocks and arrays are more clearly distinguished; we will investigate whether we can make use of this to provide a better data representation for OCaml datatypes.
Another interesting direction is to see if we can encode modules in a consistent style: as a JS dictionary for both global modules and local modules.
Improving the usability of BuckleScript toolchain
BuckleScript is focused on making better use of the JS ecosystem and providing value for shipping BuckleScript-produced JS code in production.
Making the BuckleScript toolchain more lightweight
Currently it is still too heavy for users to ship JS libraries produced by BuckleScript, since clients need to install bs-platform, which requires a lot of disk space and native compilation on some platforms. We will investigate whether we can distribute the native compiler without using npm.
Separating the compiler from the stdlib may also help draw contributions to the stdlib/Belt library.
Improving the usability of bsb
Currently bsb is restricted by the npm directory layout, and so are the generated JS artifacts; we will see whether we can relocate the JS artifacts or provide more flexibility for users.
In the latest BuckleScript release, we introduced a minor change in the codegen which broke some user libraries. Note this change only affects code at the FFI boundary (the interop with JS).
In the early days of BuckleScript, there was no built-in uncurried calling convention support. Since OCaml is a curried language, every function has arity one, so there is no way to express that a function has arity zero; this makes some interop challenging. The mocha unit test library, for instance, expects its callback to be a function of arity zero.
To work around this issue, before this release we applied a small codegen optimization: for a function of type unit -> unit whose argument is not used, we removed the argument in the output.
let f : unit -> int = fun () -> 3
let f_used : unit -> unit = fun x -> Js.log x
let f: unit => int = () => 3;
let f_used: unit => unit = x => Js.log(x);
To make this hack work on the application side, for curried function application we treated functions of arity 0 and arity 1 in the same way. This still works because curried function application can only happen on OCaml functions.
This trick is unintuitive and makes the generated code less predictable, and it is no longer relevant since we later added native uncurried calling convention support.
Therefore, we generate JS code in a more consistent style in this release:
let f : unit -> int = fun() -> 3
let f: unit => int = () => 3;
function f(param) {
  return 3;
}
So in your FFI code, if you have a callback which is expected to be of arity zero, use unit -> unit [@bs] or unit -> unit [@bs.uncurry]; that is 100% correct. Note our previous trick would only make unit -> unit work most of the time; it could not provide any guarantee.
Since we removed the trick, the curried runtime no longer treats functions of arity 0 and arity 1 in the same way, so if you have code like this
let f : unit -> int = [%bs.raw {|function() {
return 3
}|}]
let f: unit => int = [%bs.raw {|function () {
return 3
}|}];
It is no longer correct; the fix would be
let f : unit -> int = [%bs.raw{|function(param){
return 3
}|}]
let f: unit => int = [%bs.raw {|function(param) {
return 3
}|}];
Or
let f : unit -> int [@bs] = [%bs.raw{|function(){
return 3
}|}]
let f: (. unit) => int = [%bs.raw {|function() {
return 3
}|}];
FFI is a double-edged sword: it is a must-have to ship your product, yet it is tricky, and there may be undefined behavior you rely on without recognizing it. You are encouraged to always test your FFI at the boundary.
bs-platform 4.0.0 introduces a new runtime representation for optionals.
While beforehand None was represented at runtime as 0 and Some("hello") as an array ["hello"], the new representation tries to unbox optionals as much as possible.
Now None is represented as undefined and Some("hello") simply as "hello".
Generally speaking, Some(v) is represented as v, i.e. unboxed. The only exception is when v itself is None or a nested Some(...Some(None)), in which case a special boxed representation is used.
The construction of new values Some(-), and pattern matching | Some(-) => ..., perform some case analysis to decide when to box or unbox values. In the absence of nested optionals, the result of both operations will always be the identity.
Because of that, it's possible to use type-based optimization to avoid performing case analysis in the first place.
So while the generic function (x) => Some(x) will generate code to check whether x should be boxed, the more type-specific function (x:int) => Some(x) is just compiled as the identity function, as it's clear from the type that no boxing is required.
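To make the case analysis concrete, here is a minimal sketch of the idea in plain JS (the name BS_PRIVATE_NESTED and the exact boxed shape are our illustration, not the actual runtime representation):

```javascript
// Sketch of unboxed optionals: None is `undefined`; Some(v) is `v` itself,
// unless `v` could be confused with an optional, in which case it is boxed
// with a nesting depth.
function isBoxed(v) {
  return v !== null && typeof v === "object" && v.BS_PRIVATE_NESTED !== undefined;
}

function some(v) {
  if (v === undefined) return { BS_PRIVATE_NESTED: 0 };                  // Some(None)
  if (isBoxed(v)) return { BS_PRIVATE_NESTED: v.BS_PRIVATE_NESTED + 1 }; // Some(Some(...))
  return v; // common case: Some(v) is just v, unboxed
}

function valFromOption(v) {
  // Inverse of `some`: unwrap one level of Some.
  if (!isBoxed(v)) return v;
  return v.BS_PRIVATE_NESTED === 0
    ? undefined
    : { BS_PRIVATE_NESTED: v.BS_PRIVATE_NESTED - 1 };
}

console.log(some("hello"));                  // "hello" (identity in the common case)
console.log(some(null));                     // null (null is never boxed)
console.log(valFromOption(some(undefined))); // undefined, i.e. None
```

A type-specific (x: int) => Some(x) can skip all of this and compile to the identity, which is the type-based optimization described above.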
For a high-level formalization of the boxing and unboxing operations, as well as the polymorphic comparison functions, see this gist.
One design choice was whether to represent None as null or as undefined. The choice of undefined was made because this allows a direct mapping for optional labeled arguments. As a consequence null is never boxed, so e.g. Some(null) is represented as null.
bs-platform 4.0.0 is released! It has some nice features that we want to share with you, a more detailed list of changes is available here
In this post, I will talk about a new development workflow in which all toolchains are self-contained in bs-platform; Cristiano will talk about the new runtime encoding for optionals.
A simple approach to accelerating the feedback loop in a reliable way
For modern day-to-day development, developers expect that whenever files change, the build process is re-triggered automatically and the browser reloads instantly; this feedback loop should be quick enough that developers don't get distracted.
What we have before this release is as below:
Source file changes are detected by bsb watch mode, which rebuilds
Webpack notices the modified JS files, rebundles, and updates the browser state
Both bsb and webpack have a watch mode, but they are architected in fundamentally different ways, and they achieve different levels of reliability.
In bsb watch mode, there is no long-running memory-hungry process: whenever a file changes, a fresh build process starts very fast and dies quickly. Our experience shows that in practice a bsb watcher can run for a week without going into a bad state, and the feedback loop remains instant.
Webpack holds lots of objects in memory and runs for a long time, which results in less reliability and OOMs from time to time.
Another complexity introduced by a JS bundler is that it explodes the user's directory structure; for beginners trying to get started with BuckleScript, installing such a huge number of directories is intimidating. On a slow network, this used to result in installation failures.
We understand that existing JS bundlers have a huge ecosystem and are invaluable in production, but we are exploring whether we can provide a similar or even more reliable development experience without introducing such complexity.
Below is a new workflow we are exploring in this release:
NodeJS module loader in browser
Instead of bundling the modules like normal bundlers, we provide a NodeJS module loader so that it simply reloads the module without bundling.
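The loader idea can be sketched in a few lines (a toy illustration, not the actual bs-platform loader; in the real tool, module sources would be fetched over HTTP rather than held in a table):

```javascript
// Toy CommonJS-style loader: each module gets a fresh `module.exports`,
// and "reloading" is just dropping the cache entry and evaluating again.
const cache = {};
const sources = {
  // Stand-in for sources fetched over HTTP in a real loader.
  "./math": "exports.add = (a, b) => a + b;",
};

function load(name) {
  if (cache[name]) return cache[name].exports;
  const module = { exports: {} };
  cache[name] = module;
  // Evaluate the module body with its own exports/module/require.
  new Function("exports", "module", "require", sources[name])(
    module.exports, module, load);
  return module.exports;
}

function reload(name) {
  delete cache[name]; // no bundling, no stale state: just re-evaluate
  return load(name);
}

console.log(load("./math").add(1, 2)); // 3
```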
Note that ideally this could be achieved using the ES6 module spec; however, it is not practical for the following reasons:
Most dependencies are not strictly ES6 compliant; this is true even for libraries authored in ES6 style
import {createElement} from "react"; // not ES6 compliant
import {createElement} from "node_modules/react/index"; // not ES6 compliant
import {createElement} from "./node_modules/react/index.js"; // correct ES6 module
ES6 modules do not allow an indirection. By introducing our own NodeJS module loader, we gain an indirection and more metadata about each module, so that we can do more reflection work in the future.
Loading in a clean state without bundling seems to introduce some redundant work, but on the contrary, it is very fast: it can load 200 modules in under 150ms. Even better, since there is no cached state in a long-running process, it is much more reliable.
WebSocket integration with bsb
We need a mechanism to communicate between the browser and the build system, so that whenever a rebuild finishes, the browser gets notified.
Instead of introducing more dependencies, we implemented a minimal websocket interface so that whenever a rebuild finishes, the websocket clients subscribed to the port get notified.
Conclusion, and how to try it out
So the proposed new workflow is as follows: whenever a source file changes, bsb rebuilds; if the build succeeds, it notifies the browser to reload the NodeJS modules directly.
All the devtools are provided by bs-platform. The good thing is that there is no long-running memory-hungry process, so we expect it to deliver a more reliable and consistent experience.
You can try it out in bs-platform@4.0.0
bsb -init test -theme react-lite
cd test
npm install
npm start
http-server # start a http server
Open localhost:port/index.html, change the Reason source code, and expect the browser to show the changes.
Hey again! The release two days ago removed the deprecated Js Boolean APIs (no longer needed since we compile OCaml booleans to JS boolean since 3.0.0). But folks have voiced that the removal was too hasty, as some of their dependencies still haven't upgraded to 3.0.0 and thus still needed the deprecated APIs.
We try to be diligent with our releases; hopefully this didn't churn too many people. To remediate the situation, we're putting those calls back for this version. Fingers crossed that you don't have to wait on too many dependencies!
Sorry for the small churn, and thanks for all your feedback!
Since BuckleScript 3.0, OCaml bool compiles to JS boolean. The boolean conversion functions were first deprecated (they became no-ops, with warnings during build), and are now completely removed. No more need for the converter functions!
bs-platform 3.0.0 is released! Go get it. This is a great release.
Highlighted features:
OCaml/Reason boolean are finally compiled as JS boolean! Due to historical limitations, OCaml true/false was compiled to 1/0 in JS. This caused quite a bit of confusion for newcomers. It now compiles to JS true/false. Special thanks to Cristiano for all the hard work.
New object type feature. This is an experimental and potentially much better way to bind to JS objects that potentially obsoletes the need for a few other APIs. Please see the linked docs and help us test it!
raw now accepts a function declaration with an unsafe string body: let f = [%raw (a, b) => "return a + b"] (OCaml syntax: let f = [%raw fun a b -> "return a + b"]). This makes embedding escape-hatch raw JS code even easier for the compiler to optimize for speed and readability, as you've indicated that the raw code block is a function, with specific numbers of arguments.
We've been working on the BuckleScript compiler for almost four years now; meanwhile, the OCaml type checker itself has been engineered for almost three decades. After all this work, we believe that BuckleScript has reached a stable and reliable stage.
Below is a list of to-dos that we will work on in the future. Suggestions welcome!
Upgrade the OCaml version. OCaml is quite a stable language; there are not too many changes between BuckleScript's OCaml version and latest stable one. Nonetheless, it's good to keep up with the OCaml ecosystem.
A uniform representation for local modules/global modules. Currently local modules are compiled to arrays, while global modules are compiled to ES6/CommonJS/AMD modules (the cost of local modules is low though, thanks to aggressive inlining).
Continue improving Belt. Some initial nice numbers here.
Enhance FFI to allow creation of idiomatic, type safe JS classes.
Introduce a debug mode to enhance the printing of OCaml data structures.
Performance. Compiler performance and the generation of more performant, readable code are always our top concerns.