CMD runs an application when a container is created; that application should already have been installed inside the image using RUN (e.g. RUN apt-get install …)
Docker Explained: Using Dockerfiles to Automate Building of Images | DigitalOcean - 0 views
-
-
ENTRYPOINT argument sets the concrete default application that is used every time a container is created using the image.
-
The ENV command is used to set one or more environment variables.
- ...6 more annotations...
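A minimal sketch tying these Dockerfile instructions together (the base image, package, and nginx path are illustrative assumptions, not taken from the article):

    # Hypothetical Dockerfile written out via a heredoc
    cat > Dockerfile <<'EOF'
    FROM ubuntu:22.04
    # RUN installs the application into the image at build time
    RUN apt-get update && apt-get install -y nginx
    # ENV sets an environment variable available inside containers
    ENV NGINX_PORT=80
    # ENTRYPOINT fixes the default application for every container
    ENTRYPOINT ["/usr/sbin/nginx"]
    # CMD supplies default arguments that can be overridden at run time
    CMD ["-g", "daemon off;"]
    EOF

    docker build -t demo-nginx .
    docker run -d -p 8080:80 demo-nginx   # runs the ENTRYPOINT with the default CMD arguments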
How To Install and Use Docker: Getting Started | DigitalOcean - 0 views
-
Docker, as a project, offers you a complete set of higher-level tools to carry everything that forms an application across systems and machines, virtual or physical, and it brings many more benefits along with it
-
docker daemon: used to manage Docker (LXC) containers on the host it runs on
-
docker CLI: used to command and communicate with the docker daemon
- ...20 more annotations...
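A short sketch of the CLI driving the daemon (the image and container name are just examples):

    docker info                        # ask the daemon for host-wide details
    docker run -d --name web nginx     # the daemon pulls the image, then creates and starts a container
    docker ps                          # list the containers the daemon is managing
    docker logs web                    # fetch the container's output
    docker stop web && docker rm web   # stop and remove it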
How To Create a Kubernetes Cluster Using Kubeadm on Ubuntu 18.04 | DigitalOcean - 0 views
-
A pod is an atomic unit that runs one or more containers.
-
Pods are the basic unit of scheduling in Kubernetes: all containers in a pod are guaranteed to run on the same node that the pod is scheduled on.
-
Each pod has its own IP address, and a pod on one node should be able to access a pod on another node using the pod's IP.
- ...12 more annotations...
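A minimal pod manifest illustrating those points (pod, container, and image names are hypothetical):

    cat <<'EOF' | kubectl apply -f -
    apiVersion: v1
    kind: Pod
    metadata:
      name: demo-pod
    spec:
      containers:                # both containers are scheduled together on the same node
      - name: app
        image: nginx
      - name: sidecar
        image: busybox
        command: ["sh", "-c", "sleep 3600"]
    EOF
    kubectl get pod demo-pod -o wide   # shows the node it landed on and the pod's own IP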
How To Benchmark HTTP Latency with wrk on Ubuntu 14.04 | DigitalOcean - 0 views
-
wrk, which measures the latency of your HTTP services at high loads.
-
Latency refers to the time interval between the moment the request was made (by wrk) and the moment the response was received (from the service).
-
These tests aren't a substitute for real users, but they should give you a good estimate of expected latency
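A typical invocation, assuming a service listening on 127.0.0.1:8080:

    # 4 threads, 100 open connections, 30-second run, with a latency distribution report
    wrk -t4 -c100 -d30s --latency http://127.0.0.1:8080/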
How To Use Bash's Job Control to Manage Foreground and Background Processes | DigitalOcean - 0 views
-
Most processes that you start on a Linux machine will run in the foreground. The command will begin execution, blocking use of the shell for the duration of the process.
-
By default, processes are started in the foreground. Until the program exits or changes state, you will not be able to interact with the shell.
-
stop the process by sending it a signal
- ...17 more annotations...
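A quick job-control walkthrough (any long-running command works in place of sleep):

    sleep 300            # starts in the foreground and blocks the shell
    # press Ctrl-Z: the shell sends SIGTSTP, stopping the process and returning the prompt
    jobs                 # the stopped job is listed as %1
    bg %1                # resume it in the background; the prompt stays usable
    kill -TERM %1        # stop the process by sending it a signal
    # fg %1 would instead bring the job back to the foreground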
Understanding Nginx Server and Location Block Selection Algorithms | DigitalOcean - 0 views
-
A server block is a subset of Nginx’s configuration that defines a virtual server used to handle requests of a defined type. Administrators often configure multiple server blocks and decide which block should handle which connection based on the requested domain name, port, and IP address.
-
A location block lives within a server block and is used to define how Nginx should handle requests for different resources and URIs for the parent server. The URI space can be subdivided in whatever way the administrator likes using these blocks. It is an extremely flexible model.
-
Nginx logically divides the configurations meant to serve different content into blocks, which live in a hierarchical structure. Each time a client request is made, Nginx begins a process of determining which configuration blocks should be used to handle the request.
- ...37 more annotations...
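A sketch of one server block containing two location blocks (the domain and paths are placeholders):

    cat > /etc/nginx/conf.d/example.conf <<'EOF'
    server {
        listen 80;                  # requests are matched on IP and port...
        server_name example.com;    # ...and on the requested domain name

        location / {                # the general URI space for this virtual server
            root /var/www/example;
        }

        location /images/ {         # a more specific subdivision of the URI space
            root /var/www/static;
        }
    }
    EOF
    nginx -t && nginx -s reload     # validate, then reload the configuration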
Understanding Nginx HTTP Proxying, Load Balancing, Buffering, and Caching | DigitalOcean - 0 views
-
allow Nginx to pass requests off to backend http servers for further processing
-
Nginx is often set up as a reverse proxy solution to help scale out infrastructure or to pass requests to other servers that are not designed to handle large client loads
-
explore buffering and caching to improve the performance of proxying operations for clients
- ...48 more annotations...
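A hedged reverse-proxy sketch with buffering and caching; the upstream addresses and cache zone are assumptions:

    cat > /etc/nginx/conf.d/proxy.conf <<'EOF'
    proxy_cache_path /var/cache/nginx keys_zone=appcache:10m;

    upstream backend {                 # pool of backend http servers
        server 10.0.0.11:8080;
        server 10.0.0.12:8080;
    }

    server {
        listen 80;
        location / {
            proxy_pass http://backend; # hand the request off to a backend for further processing
            proxy_buffering on;        # buffer responses so slow clients don't tie up backends
            proxy_cache appcache;      # serve repeat requests from the cache
        }
    }
    EOF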
Understanding the Nginx Configuration File Structure and Configuration Contexts | DigitalOcean - 0 views
-
discussing the basic structure of an Nginx configuration file along with some guidelines on how to design your files
-
/etc/nginx/nginx.conf
-
In Nginx parlance, the areas that these brackets define are called "contexts" because they contain configuration details that are separated according to their area of concern
- ...50 more annotations...
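A trimmed skeleton showing how those contexts nest inside /etc/nginx/nginx.conf (directive values are illustrative):

    user www-data;              # main context: outside any braces

    events {                    # events context: connection processing
        worker_connections 768;
    }

    http {                      # http context: web and proxy configuration
        server {                # server context: one virtual server
            listen 80;
            location / {        # location context: per-URI handling
                return 200 "ok";
            }
        }
    }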
An Introduction to HAProxy and Load Balancing Concepts | DigitalOcean - 0 views
-
HAProxy, which stands for High Availability Proxy
-
improve the performance and reliability of a server environment by distributing the workload across multiple servers (e.g. web, application, database).
-
ACLs are used to test some condition and perform an action (e.g. select a server, or block a request) based on the test result.
- ...28 more annotations...
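A sketch of an haproxy.cfg frontend/backend pair using an ACL (names and addresses are placeholders):

    frontend www
        bind *:80
        acl is_static path_beg /static            # test a condition on the request
        use_backend static_servers if is_static   # act on the test result
        default_backend app_servers

    backend app_servers
        balance roundrobin                        # distribute the workload across servers
        server app1 10.0.0.21:8080 check
        server app2 10.0.0.22:8080 check

    backend static_servers
        server static1 10.0.0.31:80 check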
How to Set Up Squid Proxy for Private Connections on Ubuntu 20.04 | DigitalOcean - 0 views
-
it’s a good idea to keep the deny all rule at the bottom of this configuration block.
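The corresponding squid.conf pattern, as a sketch (the client address is only an example):

    # define an ACL for a single trusted source address
    acl allowed_client src 203.0.113.10
    # allow that client, and keep the deny all rule at the bottom of the block
    http_access allow allowed_client
    http_access deny all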
Packer - 0 views