# pix2pix-tensorflow server
Host pix2pix-tensorflow models to be used with something like the [Image-to-Image Demo](https://affinelayer.com/pixsrv/).
This is a simple Python server that serves models exported with `pix2pix.py --mode export`. It can serve local models or use [Cloud ML](https://cloud.google.com/ml/) to run the model.
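The client side of the serving protocol can be sketched in a few lines: POST the image bytes to `/<model_name>` on the server and read the processed image back from the response. This mirrors what `tools/process-remote.py` does; the exact request format here is an assumption, so treat this as an illustrative sketch rather than the canonical client.

```python
# Minimal client sketch for a running pix2pix server.
# Assumption: the server accepts an HTTP POST of raw image bytes at
# /<model_name> and replies with the processed image bytes
# (tools/process-remote.py is the canonical client).
import urllib.request


def process_remote(url, input_file, output_file):
    """POST input_file's bytes to url and write the response to output_file."""
    with open(input_file, "rb") as f:
        payload = f.read()
    req = urllib.request.Request(url, data=payload, method="POST")
    with urllib.request.urlopen(req) as resp:
        result = resp.read()
    with open(output_file, "wb") as f:
        f.write(result)
    return len(result)
```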
## Exporting
You can export a model to be served with `--mode export`. As with testing, you should specify the checkpoint to use with `--checkpoint`.
```sh
python ../pix2pix.py \
  --mode export \
  --output_dir models/facades \
  --checkpoint ../facades_train
```
## Local Serving
Using the [pix2pix-tensorflow Docker image](https://hub.docker.com/r/affinelayer/pix2pix-tensorflow/):
```sh
# export a model to upload (if you did not export one above)
python ../tools/dockrun.py python tools/export-example-model.py --output_dir models/example
# process an image with the model using local tensorflow
python ../tools/dockrun.py python tools/process-local.py \
  --model_dir models/example \
  --input_file static/facades-input.png \
  --output_file output.png
# run local server
python ../tools/dockrun.py --port 8000 python serve.py --port 8000 --local_models_dir models
# test the local server
python tools/process-remote.py \
  --input_file static/facades-input.png \
  --url http://localhost:8000/example \
  --output_file output.png
```
If you open [http://localhost:8000/](http://localhost:8000/) in a browser, you should see an interactive demo. The demo expects the server to be hosting these exported models:
- [edges2shoes](https://mega.nz/#!HtYwAZTY!5tBLYt_6HFj9u2Kxgp4-I36O4EV9r3bDP44ztX3qesI)
- [edges2handbags](https://mega.nz/#!Clg3EaLA!YW2jfRHvwpJn5Elww_wM-f3eRzKiGHLw-F4A3eQCceI)
- [facades](https://mega.nz/#!f1ZjmZoa!mCSxFRxt1WLBpNFsv5raoroEigxomDVpdi40aOG1KMc)
Extract those archives into the `models` directory and restart the server to have it host the models.
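The server exposes each subdirectory of the models directory as one hosted model, reachable at `/<subdirectory-name>` (so `models/facades` is served at `/facades`). A quick way to sanity-check which models the server will see, as an illustrative sketch only (`serve.py` contains the real discovery logic):

```python
# Sketch: list the model names a local models directory would expose.
# Assumption: serve.py maps each subdirectory of --local_models_dir to a
# model served at /<subdirectory-name>; see serve.py for the real logic.
import os


def list_local_models(models_dir):
    """Return the subdirectory names that would become hosted models."""
    if not os.path.isdir(models_dir):
        return []
    return sorted(
        name
        for name in os.listdir(models_dir)
        if os.path.isdir(os.path.join(models_dir, name))
    )
```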
## Cloud ML Serving
For this you'll want to generate a service account JSON file from https://console.cloud.google.com/iam-admin/serviceaccounts/project (select "Furnish a new private key"). If you are already logged in with the gcloud SDK, you can leave off the `--credentials` option and the scripts will auto-detect credentials from gcloud.
```sh
# upload model to google cloud ml
python ../tools/dockrun.py python tools/upload-model.py \
  --bucket your-models-bucket-name-here \
  --model_name example \
  --model_dir models/example \
  --credentials service-account.json
# process an image with the model using google cloud ml
python ../tools/dockrun.py python tools/process-cloud.py \
  --model example \
  --input_file static/facades-input.png \
  --output_file output.png \
  --credentials service-account.json
```
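The `--credentials` fallback described above amounts to a simple lookup chain: an explicit flag wins, then the `GOOGLE_APPLICATION_CREDENTIALS` environment variable, then the file written by `gcloud auth application-default login`. The helper below is hypothetical; the actual resolution is done inside the `tools/` scripts and the Google auth libraries.

```python
# Sketch of the credential fallback: an explicit --credentials file wins,
# then the GOOGLE_APPLICATION_CREDENTIALS environment variable, then the
# file written by `gcloud auth application-default login`.
# (Hypothetical helper; the real lookup lives in the tools/ scripts.)
import os


def resolve_credentials(explicit_path=None):
    """Return the credentials file to use, or None if nothing is found."""
    if explicit_path:
        return explicit_path
    env_path = os.environ.get("GOOGLE_APPLICATION_CREDENTIALS")
    if env_path:
        return env_path
    default_path = os.path.expanduser(
        "~/.config/gcloud/application_default_credentials.json"
    )
    return default_path if os.path.exists(default_path) else None
```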
## Running serve.py on Google Cloud Platform
Assuming you have gcloud and Docker set up:
```sh
export GOOGLE_PROJECT=<project name>
# build image
# make sure models are in a directory called "models" in the current directory
sudo docker build --rm --tag us.gcr.io/$GOOGLE_PROJECT/pix2pix-server .
# test image locally
sudo docker run --publish 8080:8080 --rm --name server us.gcr.io/$GOOGLE_PROJECT/pix2pix-server python -u serve.py \
  --port 8080 \
  --local_models_dir models
python tools/process-remote.py \
  --input_file static/facades-input.png \
  --url http://localhost:8080/example \
  --output_file output.png
# publish image to private google container repository
python tools/upload-image.py --project $GOOGLE_PROJECT --version v1
# setup server
cp terraform.tfvars.example terraform.tfvars
# edit terraform.tfvars to put your cloud info in there
python ../tools/dockrun.py terraform plan
python ../tools/dockrun.py terraform apply
```
## Full training + exporting + hosting commands
Tested with Python 3.6, TensorFlow 1.0.0, Docker, gcloud, and [Terraform](https://www.terraform.io/downloads.html).
```sh
git clone https://github.com/affinelayer/pix2pix-tensorflow.git
cd pix2pix-tensorflow
# get some images (only 2 for testing)
mkdir source
curl -o source/cat1.jpg https://farm5.staticflickr.com/4032/4394955222_eea73818d9_o.jpg
curl -o source/cat2.jpg http://wallpapercave.com/wp/ePMeSmp.jpg
# resize source images
python tools/process.py \
  --input_dir source \
  --operation resize \
  --output_dir resized
# create edges from resized images (uses docker container since compiling the dependencies is annoying)
python tools/dockrun.py python tools/process.py \
  --input_dir resized \
  --operation edges \
  --output_dir edges
# combine resized with edges
python tools/process.py \
  --input_dir edges \
  --b_dir resized \
  --operation combine \
  --output_dir combined
# train on images (only 1 epoch for testing)
python pix2pix.py \
  --mode train \
  --output_dir train \
  --max_epochs 1 \
  --input_dir combined \
  --which_direction AtoB
# export model (creates a version of the model that works with the server in server/serve.py as well as google hosted tensorflow)
python pix2pix.py \
  --mode export \
  --output_dir server/models/edges2cats_AtoB \
  --checkpoint train
# process image locally using exported model
python server/tools/process-local.py \
  --model_dir server/models/edges2cats_AtoB \
  --input_file edges/cat1.png \
  --output_file output.png
# serve model locally
cd server
python serve.py --port 8000 --local_models_dir models
# open http://localhost:8000 in a browser and scroll to the bottom; you should be able to process
# an edges2cats image, though after only 1 epoch of training the output will be mostly noise
# serve model remotely
export GOOGLE_PROJECT=<project name>
# build image
# make sure models are in a directory called "models" in the current directory
docker build --rm --tag us.gcr.io/$GOOGLE_PROJECT/pix2pix-server .
# test image locally
docker run --publish 8000:8000 --rm --name server us.gcr.io/$GOOGLE_PROJECT/pix2pix-server python -u serve.py \
  --port 8000 \
  --local_models_dir models
# run this while the above server is running
python tools/process-remote.py \
  --input_file static/edges2cats-input.png \
  --url http://localhost:8000/edges2cats_AtoB \
  --output_file output.png
# publish image to private google container repository
python tools/upload-image.py --project $GOOGLE_PROJECT --version v1
# create a google cloud server
cp terraform.tfvars.example terraform.tfvars
# edit terraform.tfvars to put your cloud info in there
# get the service-account.json from the google cloud console
# make sure GCE is enabled on your account as well
terraform plan
terraform apply
# get name of server
gcloud compute instance-groups list-instances pix2pix-manager
# ssh to server
gcloud compute ssh <name of instance here>
# look at the logs (it can take a while to load the docker image)
sudo journalctl -f -u pix2pix
# if you have never created an http-server firewall rule before, you may need this
gcloud compute firewall-rules create http-server --allow=tcp:80 --target-tags http-server
# get ip address of load balancer
gcloud compute forwarding-rules list
# open that address in a browser; you should see the same page you saw locally
# to destroy the GCP resources, use this
terraform destroy
```