2 votes
1 answer
69 views

Deploy TPU TF Serving Model to AWS SageMaker

I have a couple of pre-trained and tested TensorFlow LSTM models, which have been trained on Google Colab. I want to deploy these models on AWS, as our entire application is deployed there. I've ...
asked by Manu Sisko
0 votes
0 answers
44 views

Building TensorFlow Serving failed with the following Bazel errors

Building TensorFlow Serving from source failed with the following Bazel errors; I couldn't find a related issue or answer via Google. $ bazel build -c opt //tensorflow_serving/model_servers:tensorflow_model_server ...
asked by user1846476
0 votes
0 answers
19 views

Can't set up specific parameters of server_core through model_config_list

I'm hitting an infinite loop when model paths are incorrect in model_config_list for tf-serving. This seems to be known behaviour for tf-serving, and there also seem to be two different ways of ...
asked by Sergiodiaz53
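For reference, a minimal model_config_list looks like the sketch below (model names and paths are placeholders). A common cause of the reload loop described in the question above is pointing base_path at a version directory; it must point at the parent directory that contains the numeric version subdirectories:

```
model_config_list {
  config {
    name: "model_a"
    base_path: "/models/model_a"
    model_platform: "tensorflow"
  }
  config {
    name: "model_b"
    base_path: "/models/model_b"
    model_platform: "tensorflow"
  }
}
```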
1 vote
1 answer
123 views

How to compile tensorflow serving (tensorflow/xla) to have llvm/mlir as shared objects rather than statically included in the binary?

I am trying to compile the tensorflow serving project, and I would like to have llvm/mlir compiled as shared objects. The project is tensorflow serving -> tensorflow -> xla and compiles to a ...
asked by Capybara
1 vote
0 answers
66 views

TensorFlow Serving Keras BERT model issue

I am trying to use TensorFlow Serving to serve a Keras BERT model, but I have a problem predicting with the REST API; the details are below. Can you please help me resolve this problem? predict output (...
asked by cceasy
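For context on REST predict calls like the one in the question above: TensorFlow Serving's REST API expects a JSON body with an `instances` (or `inputs`) key, POSTed to `/v1/models/<name>:predict` (port 8501 by default). A minimal sketch of building such a request; the model name `bert_classifier` and the input tensor names are hypothetical (inspect the real signature with `saved_model_cli show --all`):

```python
import json

def build_predict_request(model_name, instances, host="localhost", port=8501):
    """Build the URL and JSON body for a TF Serving REST predict call.

    `instances` is a list of per-example input dicts; the keys must match
    the input tensor names of the model's serving signature.
    """
    url = f"http://{host}:{port}/v1/models/{model_name}:predict"
    body = json.dumps({"instances": instances})
    return url, body

# Hypothetical BERT-style inputs; real names depend on the exported signature.
url, body = build_predict_request(
    "bert_classifier",
    [{"input_word_ids": [101, 2023, 102],
      "input_mask": [1, 1, 1],
      "input_type_ids": [0, 0, 0]}],
)
print(url)  # http://localhost:8501/v1/models/bert_classifier:predict
```

A 400 response from this endpoint usually means the instance keys or shapes do not match the signature.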
0 votes
0 answers
37 views

Why does the warmup process use only 1 CPU core when loading a model in TensorFlow Serving? How can this be fixed?

We are facing an issue where loading a new version of a model in TensorFlow Serving takes a long time during the warmup process (in our case, 10 minutes). This is a problem for us. While investigating ...
asked by Bakai Zhamgyrchiev
0 votes
0 answers
46 views

TensorFlow consumes both GPU and CPU memory

I have TensorFlow set up with GPU enabled on Debian. Upon using tensorflow.keras.models.load_model to load a model, I noticed that it utilizes both GPU memory and CPU memory (the system's RAM). I'm ...
asked by Masoud
0 votes
0 answers
48 views

How to check which models a TensorFlow Serving instance in a Docker container is capable of serving?

I want to use a newer model in the TensorFlow Serving Docker container (tensorflow/serving:latest image). This container was last initialized more than 2 years ago, so TensorFlow Serving might be a ...
asked by Eddudos Kim
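One way to answer this without inspecting the container internals: TensorFlow Serving exposes a model-status endpoint, `GET /v1/models/<model_name>` on the REST port (8501 by default), which reports the loaded versions and their states. A small sketch for parsing that response; the response content here is illustrative:

```python
import json

def served_versions(status_json):
    """Return (version, state) pairs from a TF Serving model-status response."""
    payload = json.loads(status_json)
    return [(int(entry["version"]), entry["state"])
            for entry in payload.get("model_version_status", [])]

# Illustrative response body, as returned by GET /v1/models/<model_name>:
sample = json.dumps({
    "model_version_status": [
        {"version": "2", "state": "AVAILABLE",
         "status": {"error_code": "OK", "error_message": ""}}
    ]
})
print(served_versions(sample))  # [(2, 'AVAILABLE')]
```

A version stuck outside the AVAILABLE state (or a 404 for the model name) indicates the server never picked the model up.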
0 votes
0 answers
19 views

Compute specs required to build the tensorflow serving docker image

I'm trying to build the TensorFlow Serving Docker image from scratch. Currently, I'm maxing out every CPU I attempt to spin up. There is no official documentation regarding this. Does anyone have any ...
asked by Bilaal Rashid
0 votes
0 answers
42 views

Issue: StatusCode.FAILED_PRECONDITION

I'm trying to serve a model with TensorFlow Serving through Docker (tensorflow/serving:2.16.1) and got this issue: `<_InactiveRpcError of RPC that terminated with: status = StatusCode.FAILED_PRECONDITION details = &...
asked by Nhựt Tiến
0 votes
0 answers
38 views

Tensorflow Serving: Adding warm start data at runtime

I'm trying to dynamically add warm-start data for our models via the SavedModel Warmup method (https://www.tensorflow.org/tfx/serving/saved_model_warmup). In our case, we need to actually have the ...
asked by trevoryao
0 votes
0 answers
30 views

Multiple Model Configuration in tensorflow serving

I have created a file model.config for the configuration details. It is inside model_config, and model_config is inside the untitled folder from which I am executing my script, and I'm getting the path not ...
asked by Sawan Rawat
0 votes
0 answers
27 views

Handle label feature in TFX in different environments

I'm new to MLOps and trying to figure out how to work with the label feature in data. I read that, for uniformity of the data, it is necessary to use the same schema for both the training and validation ...
asked by AnnacKK
0 votes
0 answers
74 views

Ragged Tensor as an output from Tensorflow serving

We use TensorFlow Serving to serve models in production. We have a use case where the output of the model is a ragged tensor. To see whether TensorFlow Serving supports a ragged tensor as output, we ...
asked by Ritesh
0 votes
1 answer
23 views

Is there another loss that can replace seq2seq.sequence_loss in TensorFlow?

I am running a CVAE for text generation, using TensorFlow > 2.0. The problem is that for my loss I use seq2seq.sequence_loss. I tried to update the code from TensorFlow v1 to v2, since the code was ...
asked by svmmy_776
