Gunicorn: Gunicorn + Flask + TensorFlow not working in a Docker container

Created on 03 Oct 2019  Β·  23 comments  Β·  Source: benoitc/gunicorn

Hello,

I have a TensorFlow 2.0 project with a small Flask API in front of it, so that requests can be made to the model over HTTP, with the data preprocessing already done in the API. I chose Gunicorn to run the Flask/TensorFlow application in a Docker container. Unfortunately, the worker process that Gunicorn spawns hangs in the container until Gunicorn kills it. The server never comes up and no requests can be made. The very same Gunicorn setup works perfectly outside Docker on my host machine.

Docker logs (it just stops there, and after a long time a timeout error is printed):

[2019-10-03 18:03:05 +0000] [1] [INFO] Starting gunicorn 19.9.0
[2019-10-03 18:03:05 +0000] [1] [INFO] Listening at: http://127.0.0.1:8000 (1)
[2019-10-03 18:03:05 +0000] [1] [INFO] Using worker: sync
[2019-10-03 18:03:05 +0000] [8] [INFO] Booting worker with pid: 8
2019-10-03 18:03:08.126584: I tensorflow/core/platform/cpu_feature_guard.cc:142] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2019-10-03 18:03:08.130017: I tensorflow/core/platform/profile_utils/cpu_utils.cc:94] CPU Frequency: 3392000000 Hz
2019-10-03 18:03:08.130306: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x55fbb23fb2d0 executing computations on platform Host. Devices:
2019-10-03 18:03:08.130365: I tensorflow/compiler/xla/service/service.cc:175]   StreamExecutor device (0): Host, Default Version

Dockerfile:

FROM python

RUN pip install gunicorn

WORKDIR /usr/src/app

COPY requirements.txt ./
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

EXPOSE 8000

CMD [ "gunicorn", "--chdir", "src", "api:app" ]

api.py:

from flask import Flask, request
import inference

app = Flask(__name__)

@app.route('/', methods=['GET', 'POST'])
def predict():
    if request.method == 'GET':
        return 'POST a json payload of {"imageBase64": "base64base64base64"} to this address to predict.'
    try:
        result = inference.run(request.json['imageBase64'])
        return result
    except Exception as e:
        return {'error': str(e)}, 500

if __name__ == "__main__":
    app.run()
else:
    print('\n * Server ready!')

inference.py:

# Import packages
from __future__ import absolute_import, division, print_function, unicode_literals

import os
import tensorflow as tf
from tensorflow import keras
import PIL
import numpy as np
from io import BytesIO
import base64
import json

print("TensorFlow version is ", tf.__version__)

# Set variables
##########################################################################################
##########################################################################################

model_name = 'catsdogs'

base_dir = os.path.join(os.path.dirname(__file__), '..')
model_dir = os.path.join(base_dir, 'models')

##########################################################################################
##########################################################################################

# Load model
model = keras.models.load_model(os.path.join(model_dir, model_name + '.h5'))

# Load metadata
with open(os.path.join(model_dir, model_name + '_metadata.json')) as metadataFile:
    metadata = json.load(metadataFile)

# Split metadata
labels = metadata['training_labels']
image_size = metadata['image_size']

# Exported function for inference
def run(imgBase64):
    # Decode the base64 string
    image = PIL.Image.open(BytesIO(base64.b64decode(imgBase64)))

    # Prepare image
    image = image.resize((image_size, image_size), resample=PIL.Image.BILINEAR)
    image = image.convert("RGB")

    # Run prediction
    tensor = tf.cast(np.array(image), tf.float32) / 255.
    tensor = tf.expand_dims(tensor, 0, name=None)
    result = model.predict(tensor, steps=1)

    # Combine result with labels
    labeledResult = {}
    for i, label in enumerate(labels):
        labeledResult[label] = float(result[0][labels[label]])

    return labeledResult
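For context, a client would wrap an image in the JSON body that `run()` expects. A minimal sketch (the helper name and usage line are illustrative, not part of the project):

```python
# Build the {"imageBase64": ...} payload the API above expects.
# build_payload is a hypothetical helper, not part of the repository.
import base64

def build_payload(image_bytes: bytes) -> dict:
    """Encode raw image bytes as base64 text for the JSON body."""
    return {"imageBase64": base64.b64encode(image_bytes).decode("ascii")}

# e.g. requests.post("http://localhost:8000/", json=build_payload(raw_bytes))
```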

λ‚˜λŠ” μ˜€λž«λ™μ•ˆ 이것에 λŒ€ν•œ 해결책을 μ°Ύμ•˜κ³  아무 것도 생각해 내지 λͺ»ν–ˆμŠ΅λ‹ˆλ‹€. μ–΄λ–€ 도움이라도 λŒ€λ‹¨νžˆ κ°μ‚¬ν•˜κ² μŠ΅λ‹ˆλ‹€.

감사 ν•΄μš”!

Labels: Feedback Requested Β· Feature: Worker Β· Feature: IPC Β· Platform: Docker

Most helpful comment

Had the same issue. As far as I can guess from my own logs, it seems that tensorflow uses gevent , and you can't use gevent in gunicorn at the same time. The --workers and --threads flags made no difference, but changing --worker-class=gevent to --worker-class=gthread fixed the problem for me. Thanks @javabrett

All 23 comments

Does your Docker setup limit the maximum memory available to the container?

Same experience here. I don't think Gunicorn is to blame, though. I get the same error when I run python3 api.py in a bash shell in the container.

@tlaanemaa can you confirm what @mackdelany said?

Hey. Sorry for disappearing like that.

My setup was limiting Docker's RAM slightly, but the same thing happened even with the limit removed.

I'll try running the api file without gunicorn and report back.

Thanks!

@tlaanemaa any news on that?

@benoitc hey
Sorry, I've been busy with other things and haven't had time to look into this further.
I'll try to poke at it today and get back to you.

So I tried running the app without gunicorn in the container, and that worked.
Below is the CMD bit of my Dockerfile.

Works:

CMD [ "python", "src/api.py" ]

Logs:

2019-12-02 11:40:45.649503: I tensorflow/core/platform/cpu_feature_guard.cc:142] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2019-12-02 11:40:45.653496: I tensorflow/core/platform/profile_utils/cpu_utils.cc:94] CPU Frequency: 2208000000 Hz
2019-12-02 11:40:45.653999: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x55f969cf6a40 executing computations on platform Host. Devices:
2019-12-02 11:40:45.654045: I tensorflow/compiler/xla/service/service.cc:175]   StreamExecutor device (0): Host, Default Version
TensorFlow version is  2.0.0
 * Serving Flask app "api" (lazy loading)
 * Environment: production
   WARNING: This is a development server. Do not use it in a production deployment.
   Use a production WSGI server instead.
 * Debug mode: off
 * Running on http://127.0.0.1:5000/ (Press CTRL+C to quit)

μž‘λ™ν•˜μ§€ μ•ŠμŒ:

CMD [ "gunicorn", "--chdir", "src", "api:app" ]

Logs:

[2019-12-02 11:39:22 +0000] [1] [INFO] Starting gunicorn 20.0.4
[2019-12-02 11:39:22 +0000] [1] [INFO] Listening at: http://127.0.0.1:8000 (1)
[2019-12-02 11:39:22 +0000] [1] [INFO] Using worker: sync
[2019-12-02 11:39:22 +0000] [9] [INFO] Booting worker with pid: 9
2019-12-02 11:39:24.041188: I tensorflow/core/platform/cpu_feature_guard.cc:142] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2019-12-02 11:39:24.046495: I tensorflow/core/platform/profile_utils/cpu_utils.cc:94] CPU Frequency: 2208000000 Hz
2019-12-02 11:39:24.047129: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x5623e18b5200 executing computations on platform Host. Devices:
2019-12-02 11:39:24.047183: I tensorflow/compiler/xla/service/service.cc:175]   StreamExecutor device (0): Host, Default Version

λ˜ν•œ μ›ν•˜λŠ” 경우 λ‘˜λŸ¬λ³Ό 수 μžˆλ„λ‘ μ €μž₯μ†Œλ₯Ό μ—΄μ—ˆμŠ΅λ‹ˆλ‹€.
도움이 될 수 μžˆμŠ΅λ‹ˆλ‹€

https://gitlab.com/tlaanemaa/image-classifier

Listening at: http://127.0.0.1:8000 (1)

Could the problem be that gunicorn is listening on localhost inside the container, so it can't be reached from outside?

ν”ŒλΌμŠ€ν¬ 앱이 λ™μΌν•œ μž‘μ—…μ„ μˆ˜ν–‰ν•˜κ³  μž‘λ™ν–ˆκΈ° λ•Œλ¬Έμ— κ·Έλ ‡κ²Œ μƒκ°ν•˜μ§€ μ•ŠμŠ΅λ‹ˆλ‹€.
λ˜ν•œ gunicorn 버전은 tensorflow 버전을 κΈ°λ‘ν•˜μ§€ μ•ŠμŠ΅λ‹ˆλ‹€. μ΄λŠ” μ½”λ“œμ˜ ν•΄λ‹Ή 둜그 ν–‰μ—μ„œ λ¬Έμ œκ°€ λ°œμƒν•¨μ„ μ•”μ‹œν•©λ‹ˆλ‹€. gunicorn 없이 μ‹€ν–‰ν•  λ•Œ ν”ŒλΌμŠ€ν¬λ§Œ μ‹€ν–‰ν•˜λ©΄ κΈ°λ‘λ©λ‹ˆλ‹€.
TensorFlow version is 2.0.0

What does it say at debug level?

@tlaanemaa how is your Docker daemon networking configured? Per @CaselIT's comment, it looks like clients can't connect to the Gunicorn port over the Docker network.

Can you try starting Gunicorn with the -b 0.0.0.0:8000 argument?

I don't think the problem is the network, because it looks like the server isn't starting at all; at least in the logs it never reaches the line that comes after the tensorflow import.

Nevertheless, I tried your suggestion, but it gives me an error:

CMD [ "gunicorn", "-b", "0.0.0.0:8000", "--chdir", "src", "api:app" ]

_Log:_

usage: gunicorn [OPTIONS] [APP_MODULE]
gunicorn: error: unrecognized arguments: -d

If you want to try it yourself, you can use the container image at registry.gitlab.com/tlaanemaa/image-classifier

@tlaanemaa μ—…λ°μ΄νŠΈλœ Dockerfile , 이미지 λΉŒλ“œ λͺ…λ Ή 및 μ»¨ν…Œμ΄λ„ˆ μ‹€ν–‰ λͺ…령을 λ‹€μ‹œ κ²Œμ‹œν•  수 μžˆμŠ΅λ‹ˆκΉŒ?

@javabrett sure

  • λ„μ»€νŒŒμΌ: https://gitlab.com/tlaanemaa/image-classifier/blob/master/Dockerfile
  • λΉŒλ“œ λͺ…λ Ή: docker build -t tlaanemaa/image-classifier .
  • ContainerλŠ” Porttainerλ₯Ό 톡해 μ‹€ν–‰λ˜λ©° μŠ¬ν”„κ²Œλ„ μ–΄λ–€ λͺ…령을 μ‚¬μš©ν•˜λŠ”μ§€ 잘 λͺ¨λ₯΄κ² μŠ΅λ‹ˆλ‹€. κ±°κΈ°μ—μ„œ 아무 미친 일도 μΌμ–΄λ‚˜μ§€ μ•ŠμŠ΅λ‹ˆλ‹€. ν‘œμ€€ ν•­λͺ©, 포트 8000이 μ „λ‹¬λ©λ‹ˆλ‹€.

_κ²Œμ‹œ λ‹Ήμ‹œμ˜ Dockerfile:_

FROM python:3.7

RUN pip install gunicorn

WORKDIR /usr/src/app

COPY requirements.txt ./
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

EXPOSE 8000

CMD [ "gunicorn", "-b", "0.0.0.0:8000", "--chdir", "src", "api:app" ]

What is the full docker log? Also, can you paste the command line you are using?

Can you run without Portainer for now, while we debug this, unless it's doing something unavoidable?

This works for me on Docker Desktop for Mac 2.1.0.5:

docker build -t tlaanemaa/image-classifier .
docker run -it --rm -p 8000:8000 tlaanemaa/image-classifier

POST μš”μ²­μ„ μˆ˜λ½ν•©λ‹ˆλ‹€.

전체 좜λ ₯κ³Ό κ²°κ³Όλ₯Ό μ‹€ν–‰ν•˜κ³  κ²Œμ‹œν•˜μ‹­μ‹œμ˜€.

I tried that and it works now.
Could it be that the -b flag fixed it?

Thank you so much!

What's interesting now is that when I make POST requests, those are fast, but GET requests are very slow. Then if I do GET requests for a while, those become fast, but POST becomes very slow and the worker times out. Once it responds to that POST, POST is fast again and GET is slow. It seems it can only do one of the two quickly at a time, and switching takes a while :D

λ‹€μŒμ€ μž‘μ—…μž μ‹œκ°„ 초과둜 인해 GET이 λΉ λ₯΄κ³  POSTκ°€ 느린 경우의 λ‘œκ·Έμž…λ‹ˆλ‹€.

[2020-01-10 09:34:46 +0000] [1] [CRITICAL] WORKER TIMEOUT (pid:72)
[2020-01-10 09:34:46 +0000] [72] [INFO] Worker exiting (pid: 72)
[2020-01-10 09:34:47 +0000] [131] [INFO] Booting worker with pid: 131
TensorFlow version is  2.0.0
2020-01-10 09:34:48.946351: I tensorflow/core/platform/cpu_feature_guard.cc:142] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2020-01-10 09:34:48.951124: I tensorflow/core/platform/profile_utils/cpu_utils.cc:94] CPU Frequency: 2208000000 Hz
2020-01-10 09:34:48.951612: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x56481dbabd80 executing computations on platform Host. Devices:
2020-01-10 09:34:48.951665: I tensorflow/compiler/xla/service/service.cc:175]   StreamExecutor device (0): Host, Default Version

 * Server ready!

λ˜ν•œ 일뢀 μƒν™©μ—μ„œλŠ” * Server ready! λ‘œκ·Έκ°€ 도컀 λ‘œκ·Έμ—μ„œ λ‚˜νƒ€λ‚˜μ§€ μ•ŠλŠ” 것 κ°™μŠ΅λ‹ˆλ‹€. 그것도 μ˜€ν•΄μ˜ μ†Œμ§€κ°€ μžˆμ—ˆμ„ 수 μžˆμŠ΅λ‹ˆλ‹€. κ·Έ 원인이 무엇인지 ν™•μ‹€ν•˜μ§€ μ•ŠμŠ΅λ‹ˆλ‹€.

Your current server in Docker is configured with a single sync worker/thread, so it is easy to keep it busy/blocked, which would explain what you are seeing. Try adding arguments like --workers=2 --threads=4 --worker-class=gthread .
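For reference, those flags can also live in a Gunicorn config file instead of on the CMD line. A minimal sketch combining them with the earlier -b 0.0.0.0:8000 fix (the setting names are standard Gunicorn settings; the values mirror the suggested flags):

```python
# gunicorn.conf.py -- file-based equivalent of the suggested CLI flags
bind = "0.0.0.0:8000"     # same as -b 0.0.0.0:8000
workers = 2               # --workers=2
threads = 4               # --threads=4
worker_class = "gthread"  # --worker-class=gthread
chdir = "src"             # --chdir src
```

Gunicorn picks up gunicorn.conf.py from the working directory automatically, so the Dockerfile CMD can shrink back to `gunicorn api:app`.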

@javabrett κ°μ‚¬ν•©λ‹ˆλ‹€
ν•΄κ²°ν–ˆμŠ΅λ‹ˆλ‹€!

Had the same issue. As far as I can guess from my own logs, it seems that tensorflow uses gevent , and you can't use gevent in gunicorn at the same time. The --workers and --threads flags made no difference, but changing --worker-class=gevent to --worker-class=gthread fixed the problem for me. Thanks @javabrett

Hi! As the maintainer of gevent, and a contributor to this project, I can say definitively that gevent and gunicorn work well together. Various libraries can interfere, but that's no fault of gunicorn or gevent. If it turns out otherwise, please open a new issue. Thanks!

이 νŽ˜μ΄μ§€κ°€ 도움이 λ˜μ—ˆλ‚˜μš”?
0 / 5 - 0 λ“±κΈ‰