...

  • Scripts can be combined with the pipe character (|); examples are provided throughout the page.
  • Results can be captured in log files using > (overwrite) or >> (append); see the example below and others throughout the page.
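
For example (the pipe to send.sh is taken from the info.sh section later on this page; the /data/info.log path is illustrative only):

Code Block
## pipe the output of one script into another
/ssdt/scripts/info.sh | /ssdt/scripts/send.sh -

## overwrite (>) or append to (>>) a log file
/ssdt/scripts/info.sh > /data/info.log
/ssdt/scripts/info.sh >> /data/info.log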

Update SSDT Utils  

It is important to use the latest SSDT scripts. They can be updated daily using a cron job. The following example cron entry updates the SSDT utilities at 1:00 a.m. and writes/overwrites /data/update-utilities.log. Note: there was an issue with the update.sh script that prevented it from working completely from a cron job, so it will need to be run manually one time to pull the updates.

Code Block
##run manually
/ssdt/update.sh

##cron job entry
0 1 * * *   /ssdt/update.sh default > /data/update-utilities.log 2>&1

Applying Updates (updates-pull.sh, updates-apply.sh, updates-cleanup.sh)

To apply updates, the new images must first be pulled and then applied; optionally, older versions of the images can be removed. Three scripts accomplish this: updates-pull.sh pulls the image updates, updates-apply.sh applies them to the running projects, and updates-cleanup.sh (optional) removes old images. This process can be automated using a cron job; see the cron section later in this document for more information.

Code Block
/ssdt/scripts/updates-pull.sh
docker pull docker.ssdt.io/usps-app:qa
qa: Pulling from usps-app
Digest: sha256:dd0a94e691f7292eff2a454b52a2b3c286724b4b98cbf9abc06f9dac157d57dc
Status: Image is up to date for docker.ssdt.io/usps-app:qa
docker pull docker.ssdt.io/usps-import:qa
qa: Pulling from usps-import
Digest: sha256:0fc179606d09efca36a58c4eeb179f742b49e44bff1f9dcc9ccdd672e1645084
Status: Image is up to date for docker.ssdt.io/usps-import:qa

-------------

/ssdt/scripts/updates-apply.sh /data/pilot
----
ctec: checking services

ctec: Updating usasapp
WARNING: The Docker Engine you're using is running in swarm mode.

Compose does not use swarm mode to deploy services to multiple nodes in a swarm. All containers will be scheduled on the current node.

To deploy your application across the swarm, use `docker stack deploy`.

Starting ctec_usasdb_1 ...
Starting ctec_usasdb_1 ... done
Starting ctec_usasapp_1 ...
Starting ctec_usasapp_1 ... done

ctec: Updating uspsapp
WARNING: The Docker Engine you're using is running in swarm mode.

Compose does not use swarm mode to deploy services to multiple nodes in a swarm. All containers will be scheduled on the current node.

To deploy your application across the swarm, use `docker stack deploy`.

ctec_uspsdb_1 is up-to-date
Recreating ctec_uspsapp_1 ...
Recreating ctec_uspsapp_1 ... done

-------------
/ssdt/scripts/updates-cleanup.sh
Untagged: docker.ssdt.io/usas-app@sha256:636a32b77d670aeb148a31aad22f77fe29ad52126c48ccf670374b061aac9f90
Deleted: sha256:994bd903dd2b50adfa0d5a2658c7d9123d847fdef25f8e539dcd7779ebfa3df4
Deleted: sha256:7a7baf211c302740e744b052df1bb7abbc5aa8c955eac313b3962a4f4c372c42
Error response from daemon: conflict: unable to delete eb53a4148083 (must be forced) - image is being used by stopped container c4b8ad86a225

##The error can be ignored; it means the image is still in use and will not be deleted.

Backing up databases (backup-usas.sh/backup-usps.sh)

Note that to run the backup scripts from a cron job, the exec-all-projects.sh script must be used.
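
For example (the exec-all-projects form is shown later on this page; the single-project invocation assumes the script is run with no arguments from the directory containing the project's docker-compose.yml):

Code Block
## back up the USAS database for a single project (run from the project directory)
cd /data/pilot/ctec && /ssdt/scripts/backup-usas.sh

## back up all USAS databases under a directory tree (required form for cron jobs)
/ssdt/scripts/exec-all-projects.sh /data/pilot /ssdt/scripts/backup-usas.sh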

capture.sh

capture.sh runs docker-compose logs with the --no-color option so the output is easier to read in an editor. It is used to capture the docker-compose log(s) from one or all containers in a project.
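
A couple of illustrative invocations, assuming capture.sh is run from the directory containing the project's docker-compose.yml (the pipe to send.sh is taken from the exec-all-projects example later on this page):

Code Block
## capture the docker-compose logs for the current project and write them to a file
/ssdt/scripts/capture.sh > capture.log

## capture the logs and send them directly to the SSDT
/ssdt/scripts/capture.sh | /ssdt/scripts/send.sh -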

...

After exiting, you will be returned to the docker host.  There will be a console.log file in the current directory.  Send it to the SSDT with the send.sh command.
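
For example, using send.sh as described later on this page:

Code Block
## send the console.log from the current directory to the SSDT
/ssdt/scripts/send.sh console.log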

clearlocks.sh

Note: clearlocks.sh is to be used ONLY at the request of, and with guidance from, the SSDT. Occasionally database locks occur; the log file may contain entries similar to liquibase.exception.LockException: Could not acquire change log lock. This can happen when the application stops in the middle of applying database updates. During database updates, the application must have exclusive access to the database; if the update process is interrupted, the lock may not be released, and this script is used to release it.
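
A hypothetical invocation is sketched below; it assumes the script is run from the directory containing the project's docker-compose.yml, and the exact arguments (if any) should come from the SSDT when they request that the script be run.

Code Block
## run only at the SSDT's request, from the project directory
## (any required service argument will be supplied by the SSDT; none is assumed here)
/ssdt/scripts/clearlocks.sh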

console.sh

This script is used in Diagnosing Hung Applications.  The information that follows is taken directly from that page:

...

Code Block
thread ls
thread ls | thread dump
metrics
exit
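
For reference, the exec-all-projects example later on this page invokes console.sh from the project directory with the application service name; a minimal single-project invocation looks like:

Code Block
## connect to the application console for one project
## (run from the directory containing the project's docker-compose.yml)
/ssdt/scripts/console.sh uspsapp
## then type the commands above (thread ls, thread ls | thread dump, metrics, exit) at the #> prompt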

d-images.sh (also used in info.sh)

This script provides a formatted listing of SSDT docker images on a server and includes more information than the docker images command, such as the label ID of the image.

Code Block
##Image listing using the d-images.sh script (partial listing)
/ssdt/scripts/d-images.sh
[docker.ssdt.io/usas-app:uat] usas.web-default-978
[docker.ssdt.io/usas-import:uat] usas.importapp-default-978
[docker.ssdt.io/usas-app:pilot] usas.web-2.1.1-4
[docker.ssdt.io/usas-import:pilot] usas.importapp-2.1.1-4
[docker-dev.ssdt.io/usas-app:USASR-DCK-974] usas.web-default-974

##Image listing using the docker images command (partial listing)
docker images --filter "label=io.ssdt.id"
REPOSITORY                    TAG                 IMAGE ID            CREATED             SIZE
docker.ssdt.io/usas-app       uat                 3dd369a9ec6f        2 days ago          588MB
docker.ssdt.io/usas-import    uat                 598ff16046fc        2 days ago          445MB
docker.ssdt.io/usas-app       pilot               18eeebc050e3        3 days ago          588MB
docker.ssdt.io/usas-import    pilot               690e50cbb555        3 days ago          446MB
docker-dev.ssdt.io/usas-app   USASR-DCK-974       03be781d86b7        4 days ago          588MB

d-ps.sh (also used in info.sh)

This script provides a listing of containers on the server, including ones that have been stopped. It formats the output so that only a few relevant columns are displayed.

Code Block
##Using d-ps.sh script (partial listing)
/ssdt/scripts/d-ps.sh
NAMES                     IMAGE                                       STATUS                    id                       app                 type
training00_usasapp_1      docker.ssdt.io/usas-app:pilot               Up 2 days                 usas.web-2.1.1-4         usas                webapp
training00_uspsapp_1      docker.ssdt.io/usps-app:pilot               Up 2 days                 usps.web-2.0.0-136       usps                webapp
training00_usasdb_1       docker.ssdt.io/trainingdb-usas:pilot        Up 2 days                                                              db
training00_uspsdb_1       docker.ssdt.io/trainingdb-usps:pilot        Up 2 days                                                              db
djs_usasapp_1             docker-dev.ssdt.io/usas-app:USASR-DCK-974   Up 2 days                 usas.web-default-974     usas                webapp
djs_uspsapp_1             docker.ssdt.io/usps-app:pilot               Up 2 days                 usps.web-2.0.0-136       usps                webapp
djs_usasdb_1              docker-dev.ssdt.io/trainingdb-usas:latest   Up 2 days                                                              db
djs_uspsdb_1              docker-dev.ssdt.io/trainingdb-usps:latest   Up 2 days                                                              db


##docker ps -a (partial listing)
docker ps -a
a955cad6aeaf        docker.ssdt.io/usas-app:pilot               "catalina.sh run"        2 days ago          Up 2 days                 8080/tcp, 44000/tcp                        training00_usasapp_1
3fd645864407        docker.ssdt.io/usps-app:pilot               "catalina.sh run"        2 days ago          Up 2 days                 8080/tcp, 44001/tcp                        training00_uspsapp_1
016c8d2c2d5f        docker.ssdt.io/trainingdb-usas:pilot        "ssdt-entrypoint.s..."   2 days ago          Up 2 days                 5432/tcp                                   training00_usasdb_1
ad103690c489        docker.ssdt.io/trainingdb-usps:pilot        "ssdt-entrypoint.s..."   2 days ago          Up 2 days                 5432/tcp                                   training00_uspsdb_1
b69fec84a67a        docker-dev.ssdt.io/usas-app:USASR-DCK-974   "catalina.sh run"        2 days ago          Up 2 days                 8080/tcp, 44000/tcp                        djs_usasapp_1
2af03c034c8f        docker.ssdt.io/usps-app:pilot               "catalina.sh run"        2 days ago          Up 2 days                 8080/tcp, 44001/tcp                        djs_uspsapp_1
b9b9f8175237        docker-dev.ssdt.io/trainingdb-usas:latest   "ssdt-entrypoint.s..."   2 days ago          Up 2 days                 0.0.0.0:8432->5432/tcp                     djs_usasdb_1
73694cd88435        docker-dev.ssdt.io/trainingdb-usps:latest   "ssdt-entrypoint.s..."   2 days ago          Up 2 days                 0.0.0.0:9432->5432/tcp                     djs_uspsdb_1

import-usxs.sh (import-usas.sh, import-usps.sh)

info.sh (uses d-images.sh, d-ps.sh)

This script provides various information about the docker host, including the docker version, docker-compose version, image information (using d-images.sh), container information (using d-ps.sh), and OS processes. The output can be redirected to a file or piped to send.sh to send the results to the SSDT. This may be useful when troubleshooting the application.

Code Block
##partial run of info.sh
/ssdt/scripts/info.sh
4.3.48(1)-release
Client:
 Version:      17.06.2-ce
...
Server:
 Version:      17.06.2-ce
...
docker-compose version 1.16.1, build 6d1ac21
docker-py version: 2.5.1
CPython version: 2.7.13
OpenSSL version: OpenSSL 1.0.1t  3 May 2016
------- Docker info-------------
Containers: 13
 Running: 8
 Paused: 0
 Stopped: 5
Images: 14
...
------- Images -----------------
[docker.ssdt.io/usas-app:uat] usas.web-default-978
[docker.ssdt.io/usas-app:pilot] usas.web-2.1.1-4
...
------- Containers------
NAMES               IMAGE                                     STATUS                     id                   app                 type
test_usasapp_1      docker.ssdt.io/usas-app:pilot             Up 3 days                  usas.web-2.1.1-4     usas                webapp
test_uspsapp_1      docker.ssdt.io/usps-app:pilot             Up 6 days                  usps.web-2.0.0-136   usps                webapp
...
------- OS Processes------------
top - 11:09:52 up 47 days, 21:01,  1 user,  load average: 0.06, 0.06, 0.06
Tasks: 269 total,   1 running, 268 sleeping,   0 stopped,   0 zombie
%Cpu(s):  0.7 us,  0.2 sy,  0.0 ni, 99.0 id,  0.0 wa,  0.0 hi,  0.1 si,  0.0 st
KiB Mem :  8170172 total,  1645124 free,  4663368 used,  1861680 buff/cache
KiB Swap:  1044476 total,  1013596 free,    30880 used.  3134186 avail Mem

  PID USER      PR  NI    VIRT    RES    SHR S  %CPU %MEM     TIME+ COMMAND
    1 root      20   0   37900   3804   2020 S   0.0  0.0   0:31.87 systemd
    2 root      20   0       0      0      0 S   0.0  0.0   0:00.10 kthreadd
...

##send to the ssdt
/ssdt/scripts/info.sh | /ssdt/scripts/send.sh -
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0WARNING: No swap limit support
100 26094  100    73  100 26021    227  81252 --:--:-- --:--:-- --:--:-- 81315

uploaded file as: ssdt-docker-dev-09_scripts_-_2017-10-09T15-07-41.530Z

metrics.sh

This script gathers metrics about a specific container. It is generally run at the request of the SSDT to help diagnose application slowness issues. The script must be run from a directory containing the docker-compose.yml for the container. The script generates a lot of information, so it is best redirected to a log file or piped to send.sh to send to the SSDT.

Code Block
##Pipe to a file
/data/pilot/test# /ssdt/scripts/metrics.sh usasapp > testmetrics.txt

##Send to the ssdt
/data/pilot/test# /ssdt/scripts/metrics.sh usasapp  | /ssdt/scripts/send.sh -
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  251k  100    70  100  251k     36   132k  0:00:01  0:00:01 --:--:--  132k

uploaded file as: ssdt-docker-dev-09_test_-_2017-10-09T15-38-02.557Z

pid2name.sh

This is used to get the container name based on a process id. For example, to find which processes are using a lot of resources, use the top command and filter it by java (see the tuning tool in the Linux Cheat Sheet), then use the pid2name.sh script to find which container is associated with a specific PID. Example top listing filtered to show processes using the java command:

...

Code Block
/ssdt/scripts/pid2name.sh 14032
noacsc_uspsapp_1
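
The top listing itself is elided above; as an illustrative alternative (not the documented tuning tool), a plain ps listing can be used to find busy java PIDs before passing one to pid2name.sh:

Code Block
## list java processes sorted by CPU usage (illustrative; see the Linux Cheat Sheet for the tuning tool)
ps -eo pid,pcpu,pmem,comm --sort=-pcpu | grep java | head -5
## then pass the PID of interest to pid2name.sh as shown above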

restore-usxs.sh (restore-usas.sh, restore-usps.sh)

More details to follow.

send.sh

This is used to send files securely to the SSDT.  More information is available here. Note that the script does not notify the SSDT that a file has been sent, so you will need to notify the SSDT separately.

Code Block
##send the update-utilities.log
/ssdt/scripts/send.sh update-utilities.log 
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  1542  100    90  100  1452     67   1092  0:00:01  0:00:01 --:--:--  1093

uploaded file as: ssdt-docker-dev-09_pilot_update-utilities.log_2017-09-28T15-09-35.096Z

##send a backup, adding test_troubled_db to the uploaded file name
/ssdt/scripts/send.sh backup/usasdb.2017-09-28-01-45-01.backup.gz test_troubled_db
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  279M  100   113  100  279M     16  40.1M  0:00:07  0:00:06  0:00:01 11.0M
uploaded file as: ssdt-docker-dev-09_test_troubled_db_usasdb.2017-09-28-01-45-01.backup.gz_2017-09-28T15-11-46.556Z

##send the output of a command
##this example sends the results of df -kh /data (giving information about the /data directory)
df -kh /data | /ssdt/scripts/send.sh -
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   588  100    71  100   517    481   3505 --:--:-- --:--:-- --:--:--  3517

uploaded file as: ssdt-docker-dev-09_pilot_-_2017-09-28T15-16-55.879Z

training.sh

Used to create training instances. See Setup Pilot Training Instances.

exec-all-projects.sh

This script is designed to be used with other commands. It scans for docker projects under a specified parent path (looking specifically for a docker-compose.yml file) and executes one or more commands against each project. It takes two parameters: the first specifies the parent directory, and the second specifies the command to execute against each docker-compose project. Note that it can be used with other SSDT utility scripts and in a cron job. Examples of using this script with other commands:

Using docker-compose commands with exec-all-projects:

Execute docker-compose ps against all projects under a directory

Code Block
 /ssdt/scripts/exec-all-projects.sh /data/pilot docker-compose ps
----
 docker-compose ps on ctec
----
     Name                  Command              State            Ports
-----------------------------------------------------------------------------
ctec_usasapp_1   catalina.sh run               Exit 143
ctec_usasdb_1    ssdt-entrypoint.sh postgres   Exit 137
ctec_uspsapp_1   catalina.sh run               Up         44001/tcp, 8080/tcp
ctec_uspsdb_1    ssdt-entrypoint.sh postgres   Up         5432/tcp

----
 docker-compose ps on omeresa
----
      Name                    Command             State          Ports
-----------------------------------------------------------------------------
omeresa_usasapp_1   catalina.sh run               Up      44000/tcp, 8080/tcp
omeresa_usasdb_1    ssdt-entrypoint.sh postgres   Up      5432/tcp
omeresa_uspsapp_1   catalina.sh run               Up      44001/tcp, 8080/tcp
omeresa_uspsdb_1    ssdt-entrypoint.sh postgres   Up      5432/tcp

----
 docker-compose ps on lisbon
----
      Name                   Command             State          Ports
----------------------------------------------------------------------------
lisbon_usasapp_1   catalina.sh run               Up      44000/tcp, 8080/tcp
lisbon_usasdb_1    ssdt-entrypoint.sh postgres   Up      5432/tcp
lisbon_uspsapp_1   catalina.sh run               Up      44001/tcp, 8080/tcp
lisbon_uspsdb_1    ssdt-entrypoint.sh postgres   Up      5432/tcp

----
 docker-compose ps on fayette
----
      Name                    Command             State          Ports
-----------------------------------------------------------------------------
fayette_usasapp_1   catalina.sh run               Up      44000/tcp, 8080/tcp
fayette_usasdb_1    ssdt-entrypoint.sh postgres   Up      5432/tcp
fayette_uspsapp_1   catalina.sh run               Up      44001/tcp, 8080/tcp
fayette_uspsdb_1    ssdt-entrypoint.sh postgres   Up      5432/tcp

----
 docker-compose ps on elida
----
     Name                   Command             State          Ports
---------------------------------------------------------------------------
elida_usasapp_1   catalina.sh run               Up      44000/tcp, 8080/tcp
elida_usasdb_1    ssdt-entrypoint.sh postgres   Up      5432/tcp
elida_uspsapp_1   catalina.sh run               Up      44001/tcp, 8080/tcp
elida_uspsdb_1    ssdt-entrypoint.sh postgres   Up      5432/tcp

Execute docker-compose stop against all projects under a directory

Code Block
/ssdt/scripts/exec-all-projects.sh /data/uat docker-compose stop
----
 docker-compose stop on uat1
----
Stopping uat1_usasapp_1 ... done
Stopping uat1_uspsapp_1 ... done
Stopping uat1_uspsdb_1  ... done
Stopping uat1_usasdb_1  ... done

docker-compose stop on uat2
----
Stopping uat2_usasapp_1 ... done
Stopping uat2_uspsapp_1 ... done
Stopping uat2_uspsdb_1  ... done
Stopping uat2_usasdb_1  ... done

Execute docker-compose rm -fv against all projects under a directory

Code Block
/ssdt/scripts/exec-all-projects.sh /data/uat docker-compose rm -fv

Back up databases under a directory (note there are separate commands for usas and usps) using exec-all-projects:

Note that in this example there was no running usas container for ctec. The script generated an error for ctec, but continued on through the rest of the directory tree.

Code Block
 /ssdt/scripts/exec-all-projects.sh /data/pilot /ssdt/scripts/backup-usas.sh

----
 /ssdt/scripts/backup-usas.sh on ctec
----
starting backup of usasdb for ctec
ERROR: No container found for usasdb_1
ERROR: backup verification FAILED
ERROR:

----
 /ssdt/scripts/backup-usas.sh on omeresa
----
starting backup of usasdb for omeresa
completed backup of usasdb for omeresa to ./backup/usasdb.2017-09-21-09-05-41.backup

----
 /ssdt/scripts/backup-usas.sh on lisbon
----
starting backup of usasdb for lisbon
completed backup of usasdb for lisbon to ./backup/usasdb.2017-09-21-09-05-47.backup

----
 /ssdt/scripts/backup-usas.sh on fayette
----
starting backup of usasdb for fayette
completed backup of usasdb for fayette to ./backup/usasdb.2017-09-21-09-06-01.backup

----
 /ssdt/scripts/backup-usas.sh on elida
----
starting backup of usasdb for elida
completed backup of usasdb for elida to ./backup/usasdb.2017-09-21-09-06-12.backup

Capture the log files from all containers under a directory and send one log file per project to the SSDT support server using exec-all-projects:

Note that in this case, more than one command was used with exec-all-projects.sh; the piped commands are passed as a single quoted argument.

Code Block
/ssdt/scripts/exec-all-projects.sh /data/pilot "/ssdt/scripts/capture.sh | /ssdt/scripts/send.sh -"
----
 /ssdt/scripts/capture.sh | /ssdt/scripts/send.sh - on ctec
----
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0  342k    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
uploaded file as: ssdt-docker-04_ctec_-_2017-09-21T13-48-20.767Z
100  342k  100    66  100  342k     79   413k --:--:-- --:--:-- --:--:--  413k

----
 /ssdt/scripts/capture.sh | /ssdt/scripts/send.sh - on omeresa
----
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
uploaded file as: ssdt-docker-04_omeresa_-_2017-09-21T13-48-22.126Z
100  292k  100    69  100  291k     83   353k --:--:-- --:--:-- --:--:--  353k

----
 /ssdt/scripts/capture.sh | /ssdt/scripts/send.sh - on lisbon
----
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
uploaded file as: ssdt-docker-04_lisbon_-_2017-09-21T13-48-23.183Z
100  164k  100    68  100  164k    137   331k --:--:-- --:--:-- --:--:--  331k

----
 /ssdt/scripts/capture.sh | /ssdt/scripts/send.sh - on fayette
----
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 65056  100    69  100 64987    200   184k --:--:-- --:--:-- --:--:--  184k

uploaded file as: ssdt-docker-04_fayette_-_2017-09-21T13-48-24.067Z

----
 /ssdt/scripts/capture.sh | /ssdt/scripts/send.sh - on elida
----
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  168k  100    67  100  168k    114   288k --:--:-- --:--:-- --:--:--  288k

uploaded file as: ssdt-docker-04_elida_-_2017-09-21T13-48-25.292Z

Using console.sh with exec-all-projects:

This is used to help diagnose a hung application. Even though this will work with exec-all-projects.sh, it requires user interaction with each container. It will, however, put a console.log in each project's directory as you go through the process. This is a partial example:

Code Block
/ssdt/scripts/exec-all-projects.sh /data/pilot /ssdt/scripts/console.sh uspsapp

----
 /ssdt/scripts/console.sh uspsapp on ctec
----
connecting to telnet console on ctec_uspsapp_1 in ctec
Trying 172.29.0.3...
Connected to 172.29.0.3.
Escape character is '^]'.
 :: SSDT Application Console Boot :: (v) on f9a63b125228

#> thread ls
### more things displayed here
3744   logback-2          main          5     WAITIN 0     0:0    false true
3885   Connection1        net.wimpi.tel 5     RUNNAB 0     0:1    false true

#> thread ls | thread dump

### results displayed here


#> metrics

### results displayed here

#> exit
Connection closed by foreign host.

----
 /ssdt/scripts/console.sh uspsapp on omeresa
----
connecting to telnet console on omeresa_uspsapp_1 in omeresa
Trying 172.27.0.2...
Connected to 172.27.0.2.
Escape character is '^]'.
 :: SSDT Application Console Boot :: (v) on a23952b4a357

#>

And so forth: it will connect to each container in the directory tree, and you will need to type the commands manually. Note that if you type exit right away at the prompt, it will continue on to the next container.
----
 /ssdt/scripts/console.sh uspsapp on fayette
----
connecting to telnet console on fayette_uspsapp_1 in fayette
Trying 172.31.0.5...
Connected to 172.31.0.5.
Escape character is '^]'.
 :: SSDT Application Console Boot :: (v) on 7d11b1a10271

#> exit
Connection closed by foreign host.

----
 /ssdt/scripts/console.sh uspsapp on elida
----
connecting to telnet console on elida_uspsapp_1 in elida
Trying 172.30.0.5...
Connected to 172.30.0.5.
Escape character is '^]'.
 :: SSDT Application Console Boot :: (v) on 94788ae14733

#> exit
Connection closed by foreign host.

Linux Command examples using exec-all-projects:

There may be other ways to use these commands, but these use the exec-all-projects.sh script.  For more details on Linux commands and Linux in general, see Linux Cheat Sheet.

Code Block
##List all files (a) in each project directory using the long listing format (l), sorted by modification time in reverse order (tr), with sizes shown in human-readable format (h)

/ssdt/scripts/exec-all-projects.sh /data/pilot ls -ltrah
----
 ls -ltrah on ctec
----
total 4.9M
-rw-r--r-- 1 root root  823 Aug  2 12:22 docker-compose.override.yml
-rw-r--r-- 1 root root 1.5M Aug 26 05:32 usasimport.log
-rw------- 1 root root 3.1M Aug 26 05:32 nohup.out
-rw-r--r-- 1 root root 320K Aug 30 16:22 uspsimport.log
-rw-r--r-- 1 root root 3.0K Sep  7 14:13 docker-compose.yml
-rw-r--r-- 1 root root   55 Sep  7 14:13 .docker-compose.md5
-rw-r--r-- 1 root root  402 Sep  8 16:32 .env
drwxr-xr-x 3 root root 4.0K Sep 12 10:54 .
drwxr-xr-x 8 root root 4.0K Sep 21 07:55 ..
drwxr-xr-x 2 root root 4.0K Sep 21 09:05 backup
-rw-r--r-- 1 root root  39K Sep 21 10:06 console.log

----
 ls -ltrah on omeresa
----
total 176K
-rw-r--r-- 1 root root  77K Jul 28 12:27 uspsimport.log
-rw-r--r-- 1 root root  54K Jul 28 12:29 usasimport.log
-rw-r--r-- 1 root root  827 Aug  3 11:25 docker-compose.override.yml
-rw-r--r-- 1 root root 3.0K Sep  7 14:15 docker-compose.yml
-rw-r--r-- 1 root root   55 Sep  7 14:15 .docker-compose.md5
-rw-r--r-- 1 root root  411 Sep  8 11:41 .env
drwxr-xr-x 8 root root 4.0K Sep 21 07:55 ..
drwxr-xr-x 2 root root 4.0K Sep 21 09:05 backup
drwxr-xr-x 3 root root 4.0K Sep 21 09:55 .
-rw-r--r-- 1 root root  192 Sep 21 10:08 console.log

---- and so forth

Using a cron job

Most of the scripts can be used in a cron job. See Automating Jobs in Ubuntu with Crontab for more examples on using cron specifically with the release. To use the backup scripts in a cron job, the exec-all-projects.sh script must be used. The output can be redirected to a log file: the > syntax overwrites any existing logfile with the same fully qualified name, and >> appends to any existing logfile with the same fully qualified name (both create the file if it does not exist). A date can be added to the log file name. Note that when adding a date, use the backtick ` (it is generally on the ~ key) and not the single quote '. Examples:

Code Block
##update utilities at 1:00 a.m. and write to /data/pilot/update-utilities-year-month-day-hour:minute.log (adds a date to the logfile name)
##example logfile name with this command:  update-utilities-2017-09-27-01:00.log
0 1 * * *   /ssdt/update.sh > /data/pilot/update-utilities-`date +\%Y-\%m-\%d-\%H:\%M`.log 2>&1

## pull updates at 1:10 and apply updates at 1:30 to running containers in the /data/pilot directory tree and write/overwrite the log files
10 1 * * *   /ssdt/scripts/updates-pull.sh > /data/pilot/updates-pull.log 2>&1
30 1 * * *  /ssdt/scripts/updates-apply.sh /data/pilot > /data/pilot/updates-apply.log 2>&1

## backup usps databases for running containers in the /data/pilot directory tree at 1:40 and append to the log file
## backup usas databases for running containers in the /data/pilot directory tree at 1:45 and append to the log file
40 1 * * *  /ssdt/scripts/exec-all-projects.sh /data/pilot /ssdt/scripts/backup-usps.sh >> /data/pilot/backup-usps.log 2>&1
45 1 * * *  /ssdt/scripts/exec-all-projects.sh /data/pilot /ssdt/scripts/backup-usas.sh >> /data/pilot/backup-usas.log 2>&1


...