Does inception_v3 use dilations? - tensorflow

I'd like to ask whether the inception_v3 model uses dilations. I am planning to run my model for inference on a server, but this server only has TensorFlow 1.3 installed. That version does not support dilations, so I'd like to make sure my model will work on the server.

No, inception_v3 does not use dilations, and it is compatible with TF 1.3 and up. MobileNet, however, requires TF 1.5 and up.
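If you want to verify this for your own frozen graph, you can scan its nodes for dilated convolutions. With TensorFlow installed you would parse the GraphDef from the .pb file and inspect the `dilations` attribute of Conv2D ops; as a hedged sketch, the nodes below are modeled as plain dicts so the logic stands on its own (the attribute name and default follow TF's Conv2D op).

```python
# Sketch: detect dilated convolutions in a graph. In real code the nodes
# would come from a parsed tf.GraphDef (graph_def.node); here each node
# is modeled as a dict holding its op type and attributes.
def uses_dilations(nodes):
    """Return True if any Conv2D node has a dilation rate other than 1."""
    for node in nodes:
        if node["op"] != "Conv2D":
            continue
        # TF's Conv2D defaults to dilations of [1, 1, 1, 1] (no dilation).
        dilations = node.get("dilations", [1, 1, 1, 1])
        if any(d != 1 for d in dilations):
            return True
    return False

plain = [{"op": "Conv2D"}, {"op": "Relu"}]
dilated = [{"op": "Conv2D", "dilations": [1, 2, 2, 1]}]
print(uses_dilations(plain))    # False
print(uses_dilations(dilated))  # True
```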

Related

Is there any repository where I can get a pretrained SSD model for TensorFlow 1.1?

I'm trying to convert a TensorFlow model to a TIDL model using the TIDL import tool.
The TIDL import tool only supports the layers available in TensorFlow 1.1.
So I need a pretrained model that was trained with TensorFlow 1.1.
Is there any repository, like a Git repo, or something similar?
Or is there any method to convert a model trained with a recent TensorFlow version to an older version?
Also, how can I convert some nodes in the pb file when I make a frozen graph? For example, I want to change NonMaxSuppressionV2 to NonMaxSuppression. Is there any suggestion or example code?
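For the node-rewriting part, a common approach is to load the frozen GraphDef, change the `op` field of the offending nodes, and re-serialize it. Here is a hedged sketch (graph nodes are modeled as dicts; with TensorFlow you would iterate `graph_def.node` after `ParseFromString`, as shown in the comments):

```python
# Sketch of frozen-graph surgery. With TensorFlow you would do:
#   graph_def = tf.GraphDef()
#   with open("frozen.pb", "rb") as f:
#       graph_def.ParseFromString(f.read())
#   ... mutate graph_def.node ...
#   with open("patched.pb", "wb") as f:
#       f.write(graph_def.SerializeToString())
def rename_ops(nodes, old_op, new_op):
    """Rewrite the op type of every matching node; return how many changed."""
    renamed = 0
    for node in nodes:
        if node["op"] == old_op:
            node["op"] = new_op
            renamed += 1
    return renamed

nodes = [{"op": "NonMaxSuppressionV2"}, {"op": "Conv2D"}]
rename_ops(nodes, "NonMaxSuppressionV2", "NonMaxSuppression")
```

Be aware that op versions are not always drop-in replacements: for example, the two NonMaxSuppression variants differ in how the IoU threshold is passed (attribute vs. tensor input), so you may also need to rewrite the node's inputs or attributes before the older runtime will accept the graph.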

How to get version of tensorflow object detection API

I have used the TensorFlow Object Detection API. Recently I upgraded my TensorFlow version and some incompatibility issues appeared. For this reason, I need to know my TensorFlow Object Detection API version so I can find the compatible TensorFlow version to install again.
Please report the commit SHA of the tensorflow/models git repo (you can find it by running git log).
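Concretely, the command would look like this (hedged sketch: in practice you run it inside your tensorflow/models checkout; the demo below creates a throwaway repo purely so the command has something to print):

```shell
# In practice, run this inside your tensorflow/models checkout:
#   git log -1 --format=%H
# Self-contained demo (throwaway repo so the command can run anywhere):
tmp=$(mktemp -d)
cd "$tmp"
git init -q
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m "demo commit"
git log -1 --format=%H   # prints the current commit's full 40-char SHA
```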

Converting Tensorflow 1.1 model into Tensorflow Lite

I want to convert my TensorFlow 1.1 based model to TensorFlow Lite in order to serve the model locally and remotely for a PWA. The official guide only offers Python APIs starting from 1.11, and the command-line tools only seem to work starting at 1.7. Is it possible to convert a 1.1 model to TensorFlow Lite? Has anyone had experience with this?
The tf module is an out-of-the-box pre-trained model using BiDAF. I am having difficulty serving the full tf app on Heroku, which is unable to run it. I would like to try a tf lite app to see if hosting it locally will make it faster and easier to set up as a PWA.

How to convert keras model to coreml format using coremltools

I want to convert my Keras model to Core ML using coremltools.
When I try to do this, it gives me an error:
ImportError: cannot import name 'relu6'
My TensorFlow version is 1.5.1
My Keras version is 2.1.6
The complete Colab file is here:
https://colab.research.google.com/drive/1kSeErLsp_xaU37haUrwBO5jiNlV2RCll
I have already tried different versions of the modules, but I am open to trying any version I haven't tried yet.
It looks like your installation is still trying to use Keras 2.2.0, since the error in coremltools happens after it checks that the Keras version is >= 2.2.0.
Print keras.__version__ to see which version of Keras your notebook is really using.
Also try installing an older version of keras_applications, one that still has the relu6 function; it was moved recently. A recurring problem with Keras is that functions often move around between minor versions.
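If pinning keras_applications doesn't help, a common workaround is to define relu6 yourself and register it as a custom object when loading the model, since its definition is trivial. Hedged sketch: the Keras import paths in the comments vary between versions, so adapt them to your installation; the executable part is a pure-Python reference implementation of the function.

```python
# relu6 simply clips the ReLU output at 6: relu6(x) = min(max(x, 0), 6).
# With Keras installed, the workaround looks like:
#   import keras.backend as K
#   from keras.models import load_model
#   from keras.utils.generic_utils import CustomObjectScope
#   def relu6(x):
#       return K.relu(x, max_value=6)
#   with CustomObjectScope({'relu6': relu6}):
#       model = load_model('model.h5')
# Pure-Python reference implementation of the same function:
def relu6(x):
    return min(max(x, 0.0), 6.0)

print(relu6(-2.0))  # 0.0
print(relu6(3.5))   # 3.5
print(relu6(10.0))  # 6.0
```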

TensorFlow and KenLM

How does one use KenLM with TensorFlow as a decoder?
I know about the tensorflow-with-kenlm TF fork, but it is based on TF 1.1, which lacks many features that are important for my project.
