Some Azure pipeline pitfalls to be aware of
1. In conditions or conditional insertion, use `${{ var }}`; otherwise use `$(var)` for pipeline variables. Note that shell variables inside a script are different again and are written `$var` or `${var}`.
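For example, a minimal sketch (the variable name `myVar` and its value are illustrative):

```yaml
variables:
  myVar: hello

steps:
# Conditional insertion: ${{ }} is a template expression, evaluated when the
# YAML is compiled, before the pipeline runs.
- ${{ if eq(variables.myVar, 'hello') }}:
  - script: echo "this step is inserted only when myVar is 'hello'"

# Macro syntax: $(myVar) is replaced with the variable's value at runtime.
- script: echo "pipeline variable = $(myVar)"

# Shell variables inside the script itself still use $var / ${var}.
- script: |
    GREETING="$(myVar)"
    echo "shell variable = $GREETING, also written ${GREETING}"
```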
2. `docker run`, `docker exec`, or `docker stop` may return before the command actually finishes. Possible solutions (the `docker stop` case is sketched below):
- use `sh -c cmd1.sh && wait` to wait for the background processes to finish
- use `[ "$(docker ps -q -f id=$ID)" ] && docker wait $ID` for `docker stop`
- use a loop and sleep
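A minimal sketch of the `docker stop` case, assuming an illustrative container name `mycontainer`:

```yaml
- script: |
    ID=$(docker ps -q -f name=mycontainer)
    docker stop "$ID"
    # If the container is still listed, block until Docker reports it has exited.
    [ "$(docker ps -q -f id=$ID)" ] && docker wait "$ID"
    # Alternatively, poll with a loop and sleep until it is really gone.
    while [ "$(docker ps -q -f id=$ID)" ]; do sleep 2; done
  displayName: Stop the container and wait until it has actually exited
```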
3. If a `script` step exits with 1, then `succeeded()` in the following step evaluates to false and that step is skipped. E.g. `[ $? -ne 0 ] && echo ''` can exit with 1: when the test is false, the `&&` chain returns 1, and if that is the last line of the script, the whole step is marked as failed.
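A minimal sketch of the pitfall and one way around it (step names are illustrative):

```yaml
# "true" succeeds, so the test [ $? -ne 0 ] is false, the && chain returns 1,
# and because that is the last line, the whole step is marked as failed.
- script: |
    true
    [ $? -ne 0 ] && echo 'previous command failed'
  displayName: Fails even though nothing actually went wrong

# succeeded() is now false, so this step is skipped.
- script: echo "only runs while all previous steps succeeded"
  condition: succeeded()

# One way around it: an if statement returns 0 when its condition is false.
- script: |
    true
    if [ $? -ne 0 ]; then echo 'previous command failed'; fi
  displayName: Same check, but the step exits 0
```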
4. After copying files to `$(Build.ArtifactStagingDirectory)`, publishing them with `PublishPipelineArtifact@1`, and downloading them with `DownloadPipelineArtifact@2`, the file permissions may be changed to read-only (the execute bit is lost).
Possible solution: grant execute permission again after downloading the artifact, as sketched below.
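A minimal sketch, assuming the artifact is named `drop` and contains a script `run.sh` (both names are illustrative):

```yaml
- task: DownloadPipelineArtifact@2
  inputs:
    artifact: drop
    path: $(Pipeline.Workspace)/drop

# The execute bit may have been lost in the publish/download round trip,
# so grant it again before using the file.
- script: chmod +x $(Pipeline.Workspace)/drop/run.sh
  displayName: Restore execute permission on the downloaded script
```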
5. If we use `git commit --amend`, some files may not get updated when the Azure pipeline runs. E.g. the change history of a file's content ends up looking like a -> b -> b.
6. Using a `cp` command in a `script` step to copy a file to a non-existing directory does not put the file where you expect, because `cp` will not create the missing directory: `cp myFile ./non-existing-dir/` fails without copying anything, and `cp myFile ./non-existing-dir` (no trailing slash) creates a regular file named `non-existing-dir` instead of copying into a directory.
Possible solution: create the destination directory first (e.g. with `mkdir -p`), as sketched below.
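A minimal sketch (the directory name is illustrative):

```yaml
- script: |
    # cp will not create the missing directory, so create it explicitly first.
    mkdir -p ./my-dir
    cp myFile ./my-dir/
  displayName: Copy into a directory that is guaranteed to exist
```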