Get started with a local deep learning container
This page describes how to create and set up a local deep learning container.
This guide assumes you have basic familiarity with Docker.
Before you begin
Complete the following steps to set up a Google Cloud account, enable the required APIs, and install and activate the required software.
In the Google Cloud console, go to the Manage resources page and select or create a project.
Install and initialize the gcloud CLI.
Install Docker.
If you're using a Linux-based operating system, such as Ubuntu or Debian, add your username to the docker group so that you can run Docker without using sudo:
sudo usermod -a -G docker ${USER}
You may need to restart your system after adding yourself to the docker group.
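If you would rather not restart, logging out and back in, or starting a new shell with the updated group, is usually enough; a minimal sketch, assuming a Bash shell:

newgrp docker          # start a shell with the docker group active
groups ${USER}         # confirm that "docker" appears in the list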
Open Docker. To make sure that Docker is running, run the following Docker command, which returns the current time and date:
docker run busybox date
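If this test fails, the Docker daemon may not be running. On a systemd-based Linux distribution, for example, you can check and start it as follows (a general Docker troubleshooting step, not specific to Deep Learning Containers):

sudo systemctl status docker    # check whether the daemon is active
sudo systemctl start docker     # start it if it isn't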
Use gcloud as the credential helper for Docker:
gcloud auth configure-docker
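This command registers gcloud as a credential helper by adding entries to your Docker configuration file. A quick way to verify the result, assuming the default configuration location, is:

cat ~/.docker/config.json    # the credHelpers section should map gcr.io registries to "gcloud"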
Optional: If you want to run the container with a GPU locally, install nvidia-docker.
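Optionally, before pulling a large deep learning image, you can confirm that the GPU runtime works by running nvidia-smi in a small CUDA base container. The image tag below is only an example and may need to be adjusted to a tag currently published on Docker Hub:

docker run --rm --runtime=nvidia nvidia/cuda:12.2.0-base-ubuntu22.04 nvidia-smi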
Create your container
Follow these steps to create your container.
To view a list of the available containers:
gcloud container images list \
--repository="gcr.io/deeplearning-platform-release"
You can refer to Choosing a container to help you select the container that you want to use.
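Because the repository contains many images, you may want to narrow the list. gcloud list commands accept a --filter flag; the following sketch (assuming you are looking for TensorFlow images and that the name field is filterable) restricts the output:

gcloud container images list \
    --repository="gcr.io/deeplearning-platform-release" \
    --filter="name~tf"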
If you don't need to use a GPU-enabled container, enter the following code example. Replace tf-cpu.1-13 with the name of the container that you want to use.
docker run -d -p 8080:8080 -v /path/to/local/dir:/home/jupyter \
gcr.io/deeplearning-platform-release/tf-cpu.1-13
If you want to use a GPU-enabled container, enter the following code example. Replace tf-gpu.1-13 with the name of the container that you want to use.
docker run --runtime=nvidia -d -p 8080:8080 -v /path/to/local/dir:/home/jupyter \
gcr.io/deeplearning-platform-release/tf-gpu.1-13
This command starts the container in detached mode, mounts the local directory /path/to/local/dir to /home/jupyter in the container, and maps port 8080 on the container to port 8080 on your local machine. The container is preconfigured to start a JupyterLab server, which you can access at http://localhost:8080.
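To confirm that the container started and to inspect the JupyterLab startup logs (for example, if the server does not respond at http://localhost:8080), you can use standard Docker commands. CONTAINER_ID below is a placeholder for the ID shown by docker ps:

docker ps                    # list running containers and their IDs
docker logs CONTAINER_ID     # view the JupyterLab startup output
docker stop CONTAINER_ID     # stop the container when you're finished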
What's next
Learn more about how to work with containers in the Docker documentation.
[[["Es fácil de entender","easyToUnderstand","thumb-up"],["Me ofreció una solución al problema","solvedMyProblem","thumb-up"],["Otro","otherUp","thumb-up"]],[["Es difícil de entender","hardToUnderstand","thumb-down"],["La información o el código de muestra no son correctos","incorrectInformationOrSampleCode","thumb-down"],["Me faltan las muestras o la información que necesito","missingTheInformationSamplesINeed","thumb-down"],["Problema de traducción","translationIssue","thumb-down"],["Otro","otherDown","thumb-down"]],["Última actualización: 2025-08-21 (UTC)."],[[["\u003cp\u003eThis guide details the process of creating and setting up a local deep learning container, requiring basic Docker knowledge.\u003c/p\u003e\n"],["\u003cp\u003eThe setup involves creating or selecting a Google Cloud project, installing and initializing the gcloud CLI, and installing Docker, with specific instructions for Linux users to avoid using \u003ccode\u003esudo\u003c/code\u003e.\u003c/p\u003e\n"],["\u003cp\u003eUsers can choose from available deep learning containers using a command to list them or visit the "Choosing a container" page, then using a command to either use a cpu container, or a gpu-enabled container.\u003c/p\u003e\n"],["\u003cp\u003eThe container is launched in detached mode, mounting a local directory to the container and mapping a port, which then allows the user to use a preconfigured JupyterLab server.\u003c/p\u003e\n"],["\u003cp\u003eOptionally, for those requiring GPU acceleration, the guide suggests installing \u003ccode\u003envidia-docker\u003c/code\u003e, and using the appropriate container creation command.\u003c/p\u003e\n"]]],[],null,["# Get started with a local deep learning container\n\nThis page describes how to create and set up a local deep learning container.\nThis guide expects you to have basic familiarity\nwith [Docker](https://www.docker.com/).\n\nBefore you begin\n----------------\n\nComplete the following steps to set up a Google Cloud account, enable\nthe required APIs, and install and activate the required software.\n\n1. In the Google Cloud Console, go to the **Manage resources** page\n and select or create a project.\n\n | **Note:** If you don't plan to keep the resources you create in this tutorial, create a new project instead of selecting an existing project. After you finish, you can delete the project, removing all resources associated with the project and tutorial.\n\n [Go to Manage\n resources](https://console.cloud.google.com/cloud-resource-manager)\n2. [Install and initialize the\n gcloud CLI](/sdk/docs).\n\n3. [Install Docker](https://docs.docker.com/install/).\n\n If you're using a Linux-based operating system, such as Ubuntu or Debian,\n add your username to the `docker` group so that you can run Docker\n without using `sudo`: \n\n sudo usermod -a -G docker ${USER}\n\n | **Caution:** The `docker` group is equivalent to the `root` user. See [Docker's documentation](https://docs.docker.com/engine/security/security/#docker-daemon-attack-surface) for details on how this affects the security of your system.\n\n You may need to restart your system after adding yourself to\n the `docker` group.\n4. Open Docker. To ensure that Docker is running, run the following\n Docker command, which returns the current time and date:\n\n docker run busybox date\n\n5. Use `gcloud` as the credential helper for Docker:\n\n gcloud auth configure-docker\n\n6. 
**Optional** : If you want to run the container using GPU locally,\n install\n [`nvidia-docker`](https://github.com/NVIDIA/nvidia-docker#quickstart).\n\nCreate your container\n---------------------\n\nFollow these steps to create your container.\n\n1. To view a list of containers available:\n\n gcloud container images list \\\n --repository=\"gcr.io/deeplearning-platform-release\"\n\n You may want to go to [Choosing a container](/deep-learning-containers/docs/choosing-container)\n to help you select the container that you want.\n2. If you don't need to use a GPU-enabled container, enter the following code\n example. Replace \u003cvar translate=\"no\"\u003etf-cpu.1-13\u003c/var\u003e with the name of the container\n that you want to use.\n\n docker run -d -p 8080:8080 -v /path/to/local/dir:/home/jupyter \\\n gcr.io/deeplearning-platform-release/\u003cvar translate=\"no\"\u003etf-cpu.1-13\u003c/var\u003e\n\n If you want to use a GPU-enabled container, enter the following code\n example. Replace \u003cvar translate=\"no\"\u003etf-gpu.1-13\u003c/var\u003e with the name of the container\n that you want to use. \n\n docker run --runtime=nvidia -d -p 8080:8080 -v /path/to/local/dir:/home/jupyter \\\n gcr.io/deeplearning-platform-release/\u003cvar translate=\"no\"\u003etf-gpu.1-13\u003c/var\u003e\n\nThis command starts up the container in detached mode, mounts the local\ndirectory `/path/to/local/dir` to `/home/jupyter` in the container, and maps\nport 8080 on the container to port 8080 on your local machine. The\ncontainer is preconfigured to start a JupyterLab server, which you can\nvisit at `http://localhost:8080`.\n\nWhat's next\n-----------\n\n- Learn more about how to work with containers in the [Docker\n documentation](https://docs.docker.com)."]]