Developing Truly Offline TensorFlow.js Inference

Quick writeup on how to serve an object detection model that is truly offline. If this is the first time you are developing an object detection, or any vision-related, application, this post is probably too high-level or abstract for you.

Why would you need offline inference?
a) When the cost of server-side inference is critical and the quality of a "lite" model is good enough for the task
b) When inference needs to work offline / disconnected from the cloud

Steps for building a complete offline, browser-based object detection app

1. Step 1 – Assuming you already have a trained model (I trained mine with GCP AutoML and exported the files), you will typically download three .bin files, one dictionary (dict) file, and a model.json file.

2. Step 2 – Create a skeleton of your web application. Let's assume you are using a simple HTML page with a .js reference for your detection code. The magic lines of code to detect, let's say, an object in an image are:
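A minimal sketch of those lines, using the `@tensorflow/tfjs-automl` object detection loader. The element id `img`, the `model.json` path, and the threshold values are my assumptions; adjust them to match your exported files:

```html
<img id="img" src="test.jpg">
<script>
  // Assumes tf.min.js and tf-automl.min.js are already loaded on the page
  async function run() {
    // model.json is the file exported from AutoML; the .bin shards and the
    // dict file must sit in the same directory so the loader can find them
    const model = await tf.automl.loadObjectDetection('model.json');
    const img = document.getElementById('img');
    // score / iou / topk are tunable thresholds, not required values
    const predictions = await model.detect(img, {score: 0.5, iou: 0.5, topk: 20});
    console.log(predictions); // array of {box, label, score} objects
  }
  run();
</script>
```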

There are a few options for getting the TF library, outlined here. Typically, the most common and quickest way to reference the TensorFlow.js library is with a script tag, something like:
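For reference, the script-tag approach looks like this (these are the jsDelivr CDN paths for the two packages; you may want to pin an explicit version in production):

```html
<script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs"></script>
<script src="https://cdn.jsdelivr.net/npm/@tensorflow/tfjs-automl"></script>
```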

3. Step 3 – At this point, if everything is working, you should have your prediction running. However, the real promise of "truly offline inference" has one problem (and that is why I am writing this post): if you are not connected to the web, your page will load just fine since it's on the local machine, but it will fail to download the library package, and as a result inference won't work. (As a side hack, you only need to download the package once, and subsequent inferences should not need internet connectivity.)

4. Step 4 – Making it truly offline: we need to figure out how to download tf.min.js and tf-automl.min.js and package them with the application. If you look it up, the official TensorFlow.js website does not offer direct downloads. At the source, the direct .js files are not available either, since the library is written in TypeScript.

5. Step 5 – Workaround:
a. Install npm (it ships with Node.js) from here. On Windows 10, the installer will also set up the VC++ redistributable and Python.

b. Test your install with npm -v, then run the following. You are basically packaging and downloading the tfjs libraries as archives.

npm pack @tensorflow/tfjs
npm pack @tensorflow/tfjs-automl

This should download two .tgz files, which are essentially the package archives.

c. Extract the files and you should find tf.min.js and tf-automl.min.js (under package/dist/). If you don't want to get into this hassle, you can find them on my GitHub. Remember, the caveat with this approach is that these libraries are constantly being updated, so it makes sense to download the latest versions periodically.
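The pack-and-extract steps above can be sketched as a shell session. The `tfjs/` and `tfjs-automl/` directory names are my own choice; `npm pack` prints the tarball filename it created, which we capture so the version number never has to be hard-coded:

```shell
# npm pack downloads the package and writes a tarball, printing its filename
TFJS_TGZ=$(npm pack @tensorflow/tfjs)
AUTOML_TGZ=$(npm pack @tensorflow/tfjs-automl)

# Extract each tarball into its own directory so their package/ trees don't collide
mkdir -p tfjs tfjs-automl
tar -xzf "$TFJS_TGZ" -C tfjs
tar -xzf "$AUTOML_TGZ" -C tfjs-automl

# The prebuilt browser bundles live under package/dist/
ls tfjs/package/dist/tf.min.js
ls tfjs-automl/package/dist/tf-automl.min.js
```

Copy the two .min.js files next to your HTML page and reference them with local script tags (e.g. `<script src="tf.min.js"></script>` instead of the CDN URL), and the page no longer needs any network access.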

