"[" is a command. It's actually syntactic sugar for the built-in command test which checks and compares its arguments. The "]" is actually an argument to the [ command that tells it to stop checking for arguments!
Bash If Statements: Beginner to Advanced - DEV Community - 0 views
-
Why > and < get weird inside single square brackets: Bash thinks you're trying to do an input or output redirect inside a command!
-
the [[ double square brackets ]] and (( double parens )) are not exactly commands. They're actually Bash language keywords, which is what makes them behave a little more predictably.
-
The [[ double square brackets ]] work essentially the same as [ single square brackets ], albeit with some superpowers like more powerful regex support.
-
If the regex matches, the return code of the double square brackets is 0, and thus the function returns 0; if not, everything returns 1. Wrapping a regex test in a function like this is a great way to give the regex a readable name (see the sketch after these notes).
-
the stuff immediately after the if can be any command in the whole wide world, as long as it provides an exit code, which pretty much every command does.
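Pulling these notes together, here is a small sketch (not from the article itself; the is_semver function and its regex are invented for illustration) showing a named [[ ]] regex test, an ordinary command after if, and the < pitfall inside brackets:

```bash
#!/usr/bin/env bash
# Wrapping a [[ ... =~ ... ]] test in a function gives the regex a name;
# the function's return code is simply the test's exit code (0 or 1).
is_semver() {
    [[ "$1" =~ ^[0-9]+\.[0-9]+\.[0-9]+$ ]]
}

if is_semver "1.4.2"; then
    echo "looks like a version number"
fi

# `if` accepts any command that sets an exit code, not just [ or [[:
if grep -q "root" /etc/passwd; then
    echo "found root"
fi

# Inside single brackets you must escape < and > or Bash treats them as
# redirections; [[ ]] handles them as string comparisons directly.
a=apple; b=banana
if [[ "$a" < "$b" ]]; then
    echo "$a sorts before $b"
fi
```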
Enable Rolling updates in Kubernetes with Zero downtime - 0 views
實戰篇-打造人性化 Telegram Bot (Hands-on: Building a Human-Friendly Telegram Bot) - zaoldyeck - Medium - 0 views
FreeOTP - 0 views
Textwrap.dedent() does not work - Python - Learn Code Forum - 1 views
Template Designer Documentation - Jinja2 Documentation (2.10) - 0 views
-
For an undefined variable, the default behavior is to evaluate to an empty string if printed or iterated over, and to fail for every other operation.
-
The difference between foo.bar and foo['bar'] only matters if an object has an item and an attribute with the same name. Additionally, the attr() filter only looks up attributes.
-
Variables can be modified by filters. Filters are separated from the variable by a pipe symbol (|) and may have optional arguments in parentheses.
-
to find out if a variable is defined, you can do name is defined, which will then return true or false depending on whether name is defined in the current template context.
-
You can also strip whitespace in templates by hand: if you add a minus sign (-) to the start or end of a block (e.g. a For tag), a comment, or a variable expression, the whitespace before or after that block will be removed.
-
Template inheritance allows you to build a base “skeleton” template that contains all the common elements of your site and defines blocks that child templates can override.
-
The {% extends %} tag is the key here. It tells the template engine that this template “extends” another template.
-
Blocks may not access variables from outer scopes by default, because if the block were replaced by a child template, a variable would appear that was not defined in the block or passed to the context.
-
If you have a variable that may include any of the following chars (>, <, &, or ") you SHOULD escape it unless the variable contains well-formed and trusted HTML.
-
Jinja2 functions (macros, super, self.BLOCKNAME) always return template data that is marked as safe.
-
add the recursive modifier to the loop definition and call the loop variable with the new iterable where you want to recurse.
-
a block tag works in “both” directions. That is, a block tag doesn’t just provide a placeholder to fill - it also defines the content that fills the placeholder in the parent.
-
Assignments at top level (outside of blocks, macros or loops) are exported from the template like top level macros and can be imported by other templates.
-
The include statement is useful to include a template and return the rendered contents of that file into the current namespace
-
imports are cached and imported templates don’t have access to the current template variables, just the globals by default.
-
default(value, default_value=u'', boolean=False): If the value is undefined it will return the passed default value, otherwise the value of the variable.
-
dictsort(value, case_sensitive=False, by='key', reverse=False): Sort a dict and yield (key, value) pairs.
-
The attribute you group by is stored in the grouper attribute and the list contains all the objects that have this grouper in common.
-
indent(s, width=4, first=False, blank=False, indentfirst=None): Return a copy of the string with each line indented by 4 spaces. The first line and blank lines are not indented by default.
-
join(value, d=u'', attribute=None): Return a string which is the concatenation of the strings in the sequence.
-
reject(): Filters a sequence of objects by applying a test to each object, and rejecting the objects with the test succeeding.
-
replace(s, old, new, count=None): Return a copy of the value with all occurrences of a substring replaced with a new one.
-
select(): Filters a sequence of objects by applying a test to each object, and only selecting the objects with the test succeeding.
-
sort(value, reverse=False, case_sensitive=False, attribute=None): Sort an iterable. By default it sorts ascending; if you pass it true as the first argument it will reverse the sorting.
-
tojson(value, indent=None): Dumps a structure to JSON so that it's safe to use in <script> tags.
-
unique(value, case_sensitive=False, attribute=None): Returns a list of unique items from the given iterable.
-
urlize(value, trim_url_limit=None, nofollow=False, target=None, rel=None): Converts URLs in plain text into clickable links.
-
A joiner is passed a string and will return that string every time it’s called, except the first time (in which case it returns an empty string).
-
The with statement makes it possible to create a new inner scope. Variables set within this scope are not visible outside of the scope.
-
With both trim_blocks and lstrip_blocks enabled, you can put block tags on their own lines, and the entire block line will be removed when rendered, preserving the whitespace of the contents
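To make several of these highlights concrete, here is a minimal sketch using the Python jinja2 API; the template names, block names, and variables are invented for illustration:

```python
# Ties together: template inheritance, filters, "is defined", and
# whitespace control via the minus sign and trim_blocks/lstrip_blocks.
from jinja2 import DictLoader, Environment

templates = {
    # Base "skeleton" template defining blocks that children may override.
    "base.html": (
        "<title>{% block title %}Default title{% endblock %}</title>\n"
        "<body>{% block content %}{% endblock %}</body>"
    ),
    # Child template: {% extends %} tells the engine this template
    # "extends" another; {%- ... -%} strips surrounding whitespace by hand.
    "child.html": (
        "{% extends 'base.html' %}\n"
        "{% block title %}{{ page_title | default('Untitled') }}{% endblock %}\n"
        "{% block content %}\n"
        "{%- if items is defined -%}\n"
        "{{ items | join(', ') }}\n"
        "{%- else -%}\n"
        "no items\n"
        "{%- endif -%}\n"
        "{% endblock %}"
    ),
}

# With trim_blocks and lstrip_blocks enabled, block tags can sit on their
# own lines without leaking extra whitespace into the rendered output.
env = Environment(loader=DictLoader(templates),
                  trim_blocks=True, lstrip_blocks=True)

print(env.get_template("child.html").render(items=["a", "b", "c"]))
print(env.get_template("child.html").render(page_title="Hello"))
```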
Top 5 Kubernetes Best Practices From Sandeep Dinesh (Google) - DZone Cloud - 0 views
-
There’s a lot wrong with this: you could be using the wrong version of code that has exploits, has a bug in it, or worse it could have malware bundled in on purpose—you just don’t know.
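One common way to address this (assuming a standard Kubernetes Deployment; the image name and tags below are placeholders) is to pin an exact image version rather than a floating tag:

```yaml
# Hypothetical Deployment snippet: pin an exact image tag (or digest)
# instead of relying on a floating tag like "latest".
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web
spec:
  replicas: 2
  selector:
    matchLabels:
      app: web
  template:
    metadata:
      labels:
        app: web
    spec:
      containers:
        - name: web
          # image: myapp:latest   # version unknown at deploy time
          image: myapp:1.7.3      # pinned, reproducible
```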
HTTP Toolbox | LornaJane - 1 views
Queues - Laravel - The PHP Framework For Web Artisans - 0 views
-
Laravel queues provide a unified API across a variety of different queue backends, such as Beanstalk, Amazon SQS, Redis, or even a relational database.
-
any given queue connection may have multiple "queues" which may be thought of as different stacks or piles of queued jobs.
-
if you dispatch a job without explicitly defining which queue it should be dispatched to, the job will be placed on the queue that is defined in the queue attribute of the connection configuration
-
pushing jobs to multiple queues can be especially useful for applications that wish to prioritize or segment how jobs are processed
-
Job classes are very simple, normally containing only a handle method which is called when the job is processed by the queue.
-
we were able to pass an Eloquent model directly into the queued job's constructor. Because of the SerializesModels trait that the job is using, Eloquent models will be gracefully serialized and unserialized when the job is processing.
-
When the job is actually handled, the queue system will automatically re-retrieve the full model instance from the database.
-
When using this method, the job will not be queued and will be run immediately within the current process
-
Deleting jobs using the $this->delete() method will not prevent chained jobs from being processed. The chain will only stop executing if a job in the chain fails.
-
this does not push jobs to different queue "connections" as defined by your queue configuration file, but only to specific queues within a single connection.
-
As an alternative to defining how many times a job may be attempted before it fails, you may define a time at which the job should timeout.
-
using the funnel method, you may limit jobs of a given type to only be processed by one worker at a time
-
using the throttle method, you may throttle a given type of job to only run 10 times every 60 seconds.
-
If an exception is thrown while the job is being processed, the job will automatically be released back onto the queue so it may be attempted again.
-
dispatch a Closure. This is great for quick, simple tasks that need to be executed outside of the current request cycle
-
When dispatching Closures to the queue, the Closure's code is cryptographically signed so it cannot be modified in transit.
-
once the queue:work command has started, it will continue to run until it is manually stopped or you close your terminal
-
customize your queue worker even further by only processing particular queues for a given connection
-
The --stop-when-empty option may be used to instruct the worker to process all jobs and then exit gracefully.
-
Since queue workers are long-lived processes, they will not pick up changes to your code without being restarted.
-
Since the queue workers will die when the queue:restart command is executed, you should be running a process manager such as Supervisor to automatically restart the queue workers.
-
each queue connection defines a retry_after option. This option specifies how many seconds the queue connection should wait before retrying a job that is being processed.
-
The --timeout option specifies how long the Laravel queue master process will wait before killing off a child queue worker that is processing a job.
-
When jobs are available on the queue, the worker will keep processing jobs with no delay in between them.
-
While sleeping, the worker will not process any new jobs - the jobs will be processed after the worker wakes up again
-
the numprocs directive will instruct Supervisor to run 8 queue:work processes and monitor all of them, automatically restarting them if they fail.
-
define a failed method directly on your job class, allowing you to perform job specific clean-up when a failure occurs.
-
When injecting an Eloquent model into a job, it is automatically serialized before being placed on the queue and restored when the job is processed
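As a compressed sketch of the API these highlights describe (the class, model, and queue names are invented; the traits and properties follow the Laravel docs being quoted):

```php
<?php

namespace App\Jobs;

use App\Podcast;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;

// Hypothetical job class along the lines the highlights describe.
// SerializesModels stores only the model's identifier on the queue and
// re-retrieves the full model from the database when the job is handled.
class ProcessPodcast implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public $tries = 3;    // attempts before the job is marked as failed
    public $timeout = 90; // seconds a single attempt may run

    public $podcast;

    public function __construct(Podcast $podcast)
    {
        $this->podcast = $podcast;
    }

    public function handle()
    {
        // process the podcast...
    }

    public function failed(\Exception $exception)
    {
        // job-specific clean-up when the job ultimately fails
    }
}

// Dispatching onto a specific queue within the configured connection:
//   ProcessPodcast::dispatch($podcast)->onQueue('high');
//
// A worker that drains "high" before "default":
//   php artisan queue:work --queue=high,default --tries=3 --timeout=90
```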
Queue Workers: How they work - Diving Laravel - 0 views
-
We can define workers as a simple PHP process that runs in the background with the purpose of extracting jobs from a storage space and running them with respect to several configuration options.
-
The queue:work command instructs Laravel to create an instance of your application and start executing jobs; this instance will stay alive indefinitely, which means the action of starting your Laravel application happens only once when the command is run, and the same instance will be used to execute your jobs.
-
Using queue:listen ensures that a new instance of the app is created for every job, that means you don't have to manually restart the worker in case you made changes to your code, but also means more server resources will be consumed.
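The practical difference, assuming a standard Laravel install:

```bash
# Start a long-lived worker (the app is booted once and kept in memory):
php artisan queue:work

# queue:listen boots a fresh application instance for every job - slower,
# but code changes are picked up without restarting the worker:
php artisan queue:listen

# After deploying new code to a queue:work daemon, signal it to restart:
php artisan queue:restart
```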
Introduction To The Queue System - Diving Laravel - 0 views
-
The QueueManager is registered into the container and it knows how to connect to the different built-in queue drivers
-
for example when we called the Queue::push() method, what happened is that the manager selected the desired queue driver, connected to it, and called the push method on that driver.
-
All calls to methods that don't exist in the QueueManager class will be sent to the loaded driver
-
when you do Queue::push() you're actually calling the push method on the queue driver you're using
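A stripped-down illustration of that delegation (this is not Laravel's actual source, just the pattern the article describes):

```php
<?php

// Simplified sketch: a manager forwards unknown method calls to the
// currently loaded queue driver, which is what makes Queue::push()
// end up on the driver's own push() method.
class FakeRedisQueue
{
    public function push($job)
    {
        echo "pushing " . $job . " onto redis\n";
    }
}

class QueueManager
{
    protected $driver;

    public function __construct($driver)
    {
        $this->driver = $driver;
    }

    // Any method that does not exist on the manager is sent to the driver.
    public function __call($method, $parameters)
    {
        return $this->driver->{$method}(...$parameters);
    }
}

$manager = new QueueManager(new FakeRedisQueue());
$manager->push('SendEmailJob');   // forwarded to FakeRedisQueue::push()
```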
DevOps - 0 views
-
The cloud platform supports DevOps mainly in the following three areas (the tool carrying each capability is shown in parentheses): 1. IaaS-based self-service and environment orchestration (VMWare); 2. PaaS-based elastic scaling (K8s); 3. SaaS-based software services.
Scalable architecture without magic (and how to build it if you're not Google) - DEV Co... - 0 views
-
To run NodeJS on multiple cores, you have to use something like PM2, but to do this you have to keep your code stateless.
-
Python has a very rich and sugary syntax that’s great for working with data while keeping your code small and expressive.
-
Only the first user will trigger a data query, and all others will be receiving exactly the same data straight from the RAM
-
a rate limiter: if not enough time has passed since the last request, the incoming request will be denied.
-
Backend should have different responsibilities: hashing, building web pages from data and templates, managing sessions and so on.
-
For anything related to data management or data models, move it to your database as procedures or queries.
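A toy sketch of the caching and rate-limiting ideas above; the function names, TTL, and interval are invented for illustration:

```python
# Serve repeated requests straight from RAM, and deny requests that
# arrive too soon after the previous one from the same client.
import time

_cache = {}          # query -> (result, cached_at)
_last_request = {}   # client_id -> timestamp of last accepted request

CACHE_TTL = 30       # seconds a cached result stays valid
MIN_INTERVAL = 1.0   # minimum seconds between requests per client


def rate_limited(client_id: str) -> bool:
    """Return True if not enough time has passed since the last request."""
    now = time.monotonic()
    last = _last_request.get(client_id)
    if last is not None and now - last < MIN_INTERVAL:
        return True
    _last_request[client_id] = now
    return False


def get_report(query: str) -> dict:
    """Only the first caller triggers the expensive query; the rest hit RAM."""
    now = time.monotonic()
    hit = _cache.get(query)
    if hit and now - hit[1] < CACHE_TTL:
        return hit[0]
    result = {"query": query, "rows": 42}   # stand-in for a real DB call
    _cache[query] = (result, now)
    return result


if not rate_limited("client-1"):
    print(get_report("monthly-sales"))
```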
The Twelve-Factor App - 0 views
-
One-off admin processes should be run in an identical environment as the regular long-running processes of the app.
-
Twelve-factor strongly favors languages which provide a REPL shell out of the box, and which make it easy to run one-off scripts.
The Twelve-Factor App - 0 views
-
Logs are the stream of aggregated, time-ordered events collected from the output streams of all running processes and backing services.
-
The stream can also be routed to long-term archival. These archival destinations are not visible to or configurable by the app, and instead are completely managed by the execution environment.
-
Most significantly, the stream can be sent to a log indexing and analysis system such as Splunk, or a general-purpose data warehousing system such as Hadoop/Hive.
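A minimal sketch of this approach in Python: the app writes its event stream, unbuffered, to stdout and leaves routing to the execution environment (the log-shipping command in the comment is hypothetical):

```python
# The app never manages log files or routing itself; it just emits events
# to stdout and lets the environment send the stream to Splunk, Hive, etc.
import logging
import sys

logging.basicConfig(
    stream=sys.stdout,
    level=logging.INFO,
    format="%(asctime)s %(levelname)s %(name)s %(message)s",
)

log = logging.getLogger("worker")
log.info("job started")
log.info("job finished")

# In production the environment captures the stream, e.g.:
#   python worker.py | logship --to splunk     (hypothetical log router)
```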
The Twelve-Factor App - 0 views
-
The twelve-factor app is designed for continuous deployment by keeping the gap between development and production small
-
Backing services, such as the app’s database, queueing system, or cache, are one area where dev/prod parity is important.
-
The twelve-factor developer resists the urge to use different backing services between development and production, even when adapters theoretically abstract away any differences in backing services.
-
declarative provisioning tools such as Chef and Puppet combined with light-weight virtual environments such as Docker and Vagrant allow developers to run local environments which closely approximate production environments.
-
all deploys of the app (developer environments, staging, production) should be using the same type and version of each of the backing services.
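One way to act on this locally is a compose file that pins the same service types and versions production uses; the versions below are placeholders and should mirror whatever production actually runs:

```yaml
# docker-compose.yml (sketch): run the same type and version of each
# backing service locally as in production.
version: "3.8"
services:
  db:
    image: postgres:11.7      # match the production Postgres version
    environment:
      POSTGRES_PASSWORD: dev-only
  cache:
    image: redis:5.0.9        # match the production Redis version
  queue:
    image: rabbitmq:3.8-management
```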
The Twelve-Factor App - 0 views
-
The implicit assumption is that all jobs are reentrant, which typically is achieved by wrapping the results in a transaction, or making the operation idempotent.
-
Processes should also be robust against sudden death, in the case of a failure in the underlying hardware.
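A sketch of what that can look like in a worker process (the job name and the in-memory "processed" marker are stand-ins for real queue delivery and persistence):

```python
# Finish or return the current job on SIGTERM, and make each job
# idempotent so a sudden death just means the job runs again harmlessly.
import signal
import time

shutting_down = False


def handle_sigterm(signum, frame):
    # Graceful shutdown: stop taking new jobs, let the current one finish.
    global shutting_down
    shutting_down = True


signal.signal(signal.SIGTERM, handle_sigterm)

processed = set()   # stand-in for a "processed" marker kept in the database


def run_job(job_id: str) -> None:
    if job_id in processed:     # reentrant: re-delivery is a no-op
        return
    # ... do the work inside a transaction in a real system ...
    processed.add(job_id)


while not shutting_down:
    run_job("job-1")            # stand-in for pulling the next queued job
    time.sleep(1)
```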
The Twelve-Factor App - 0 views
-
Java processes take the opposite approach, with the JVM providing one massive uberprocess that reserves a large block of system resources (CPU and memory) on startup, with concurrency managed internally via threads
-
Processes in the twelve-factor app take strong cues from the unix process model for running service daemons.
Top 10 Tags
- system (151)
- programming (133)
- docker (102)
- rails (101)
- development (89)
- devops (83)
- kubernetes (81)
- javascript (80)
- database (77)
- ruby (71)
- linux (68)
- web (64)
- server (61)
- networking (58)
- security (52)
- python (49)
- mysql (42)
- php (42)
- framework (40)
- performance (35)