Compare commits

4 Commits

SHA1        Date                        Message
aa4198af9a  2019-07-22 09:00:00 +00:00  Merge pull request #1367 from balena-io/mixpanel-args-v7
                                        Update mixpanel tracking (v7)
e0523ec8fa  2019-07-24 14:29:26 +01:00  v7.10.10
230a178228  2019-07-24 14:29:26 +01:00  Update mixpanel tracking
                                        Change-type: patch
                                        Signed-off-by: Paulo Castro <paulo@balena.io>
5d91b2e830  2019-07-24 14:29:26 +01:00  Fix typescript and global promise errors to allow for backport changes
                                        Change-type: patch
                                        Signed-off-by: Paulo Castro <paulo@balena.io>
407 changed files with 11114 additions and 95499 deletions

@@ -1,37 +0,0 @@
# Reminders:
# * Matching rules are different to `.gitignore`
# * A pattern without '**' matches in the project's root directory only
# * Leading and trailing '/' are discarded (it is not possible to
# distinguish between files and directories)
# * More details: https://github.com/balena-io-modules/dockerignore
# development and testing tools or IDEs
**/*.log
**/*.pid
**/*.seed
.idea
.lock-wscript
.nvmrc
.nyc_output
.vscode
coverage
lib-cov
logs
pids
# OS cache files
**/.DS_Store
# balena CLI config and build files
**/.balenaconf
**/.fast-boot.json
**/.resinconf
balenarc.yml
build
build-bin
dist
node_modules
oclif.manifest.json
package-lock.json
resinrc.yml
tmp

.gitattributes vendored
@@ -1,12 +0,0 @@
# Set all files to use line feed endings (since we can't match only ones without an extension)
* eol=lf
# And then reset all the files with extensions back to default
*.* -eol
*.sh text eol=lf
# lf for the docs as it's auto-generated and will otherwise trigger an uncommitted-changes error on Windows
docs/balena-cli.md text eol=lf
# crlf for the eol conversion test files
tests/test-data/projects/docker-compose/basic/service2/file2-crlf.sh eol=crlf
tests/test-data/projects/no-docker-compose/basic/src/windows-crlf.sh eol=crlf

@@ -1,76 +1,2 @@
# About this issue tracker
*The balena CLI (Command Line Interface) is a tool used to interact with the balena platform.
This GitHub issue tracker is used for bug reports and feature requests regarding the CLI
tool. General and troubleshooting questions (such as setting up your project to work with a
balenalib base image) are encouraged to be posted to the [balena
forums](https://forums.balena.io), which are monitored by balena's support team and where the
community can both contribute and benefit from the answers.*
*Please also check that this issue is not a duplicate. If there is another issue describing
the same problem or feature please add comments to the existing issue.*
*Thank you for your time and effort creating the issue report, and helping us improve
the balena CLI!*
---
# Expected Behavior
Please describe what you were expecting to happen. If applicable, please add links to
documentation you were following, or to projects that you were trying to push/build.
# Actual Behavior
Please describe what actually happened instead:
* Quoting logs and error message is useful. If possible, quote the **full** output of the
CLI, not just the error message.
* Please quote the **full command line** too. Sometimes users report that they were
"pushing" or "building" a project, but there are several ways to do so and several
possible "targets" such as balenaCloud, openBalena, local balenaOS device, etc.
Examples:
```
balena push myFleet
balena push 192.168.0.12
balena deploy myFleet
balena deploy myFleet --build
balena build . -f myFleet
balena build . -A armv7hf -d raspberrypi3
```
Each of the above command lines executes different code behind the scenes, so quoting the
full command line is very helpful.
Running the CLI in debug mode (`--debug` flag or `DEBUG=1` environment variable) may reveal
additional information. The `--logs` option reveals additional information for the commands:
```
balena build . --logs
balena deploy myFleet --build --logs
```
# Steps to Reproduce the Problem
This is the most important and helpful part of a bug report. If we cannot reproduce the
problem, it is difficult to tell what the fix should be, or whether code changes have
fixed it.
1.
1.
1.
# Specifications
- **balena CLI version:** e.g. 1.2.3 (output of the `"balena version -a"` command)
- **Cloud backend: openBalena or balenaCloud?** If unsure, it will be balenaCloud
- **Operating system version:** e.g. Windows 10, Ubuntu 18.04, macOS 10.14.5
- **32/64 bit OS and processor:** e.g. 32-bit Windows on 64-bit Intel processor
- **Install method:** npm or zip package or executable installer
- **If npm install, Node.js and npm version:** e.g. Node v8.16.0 and npm v6.4.1
# Additional References
If applicable, please add additional links to GitHub projects, forums.balena.io threads,
gist.github.com, Google Drive attachments, etc.
- **resin-cli version:**
- **Operating system and architecture:**

@@ -1,26 +0,0 @@
<!-- You can remove tags that do not apply. -->
Resolves: # <!-- Refer to an issue of this repository that this PR fixes -->
Change-type: major|minor|patch <!-- See https://semver.org/ -->
Depends-on: <url> <!-- This change depends on a PR to get merged/deployed first -->
See: <url> <!-- Refer to any external resource, like a PR, document or discussion -->
---
Please check the CONTRIBUTING.md file for relevant information and some
guidance. Keep in mind that the CLI is a cross-platform application that runs
on Windows, macOS and Linux. Tests will be automatically run by balena CI on
all three operating systems, but this will only help if you have added test
code that exercises the modified or added feature code.
Note that each commit message (currently only the first line) will be
automatically copied to the CHANGELOG.md file, so try writing it in a way
that describes the feature or fix for CLI users.
If there isn't a linked issue or if the linked issue doesn't quite match the
PR, please add a PR description to explain its purpose or the features that it
implements. Adding PR comments to blocks of code that aren't self explanatory
usually helps with the review process.
If the PR introduces security considerations or affects the development, build
or release process, please be sure to highlight this in the PR description.
Thank you very much for your contribution!

@@ -1,131 +0,0 @@
---
name: package and draft GitHub release
# https://github.com/product-os/flowzone/tree/master/.github/actions
inputs:
  json:
    description: "JSON stringified object containing all the inputs from the calling workflow"
    required: true
  secrets:
    description: "JSON stringified object containing all the secrets from the calling workflow"
    required: true
  # --- custom environment
  XCODE_APP_LOADER_EMAIL:
    type: string
    default: "accounts+apple@balena.io"
  NODE_VERSION:
    type: string
    # FIXME: (please) https://github.com/balena-io/balena-cli/issues/2165
    default: "12.x"
  VERBOSE:
    type: string
    default: "true"

runs:
  # https://docs.github.com/en/actions/creating-actions/creating-a-composite-action
  using: "composite"
  steps:
    - name: Download custom source artifact
      uses: actions/download-artifact@v3
      with:
        name: custom-${{ github.event.pull_request.head.sha || github.event.head_commit.id }}-${{ runner.os }}
        path: ${{ runner.temp }}

    - name: Extract custom source artifact
      shell: pwsh
      working-directory: .
      run: tar -xf ${{ runner.temp }}/custom.tgz

    - name: Setup Node.js
      uses: actions/setup-node@v3
      with:
        node-version: ${{ inputs.NODE_VERSION }}
        cache: npm

    - name: Install additional tools
      if: runner.os == 'Windows'
      shell: bash
      run: |
        choco install yq

    - name: Install additional tools
      if: runner.os == 'macOS'
      shell: bash
      run: |
        brew install coreutils

    # https://www.electron.build/code-signing.html
    # https://github.com/Apple-Actions/import-codesign-certs
    - name: Import Apple code signing certificate
      if: runner.os == 'macOS'
      uses: apple-actions/import-codesign-certs@v1
      with:
        p12-file-base64: ${{ fromJSON(inputs.secrets).APPLE_SIGNING }}
        p12-password: ${{ fromJSON(inputs.secrets).APPLE_SIGNING_PASSWORD }}

    - name: Import Windows code signing certificate
      if: runner.os == 'Windows'
      shell: powershell
      run: |
        Set-Content -Path ${{ runner.temp }}/certificate.base64 -Value $env:WINDOWS_CERTIFICATE
        certutil -decode ${{ runner.temp }}/certificate.base64 ${{ runner.temp }}/certificate.pfx
        Remove-Item -path ${{ runner.temp }} -include certificate.base64

        Import-PfxCertificate `
          -FilePath ${{ runner.temp }}/certificate.pfx `
          -CertStoreLocation Cert:\CurrentUser\My `
          -Password (ConvertTo-SecureString -String $env:WINDOWS_CERTIFICATE_PASSWORD -Force -AsPlainText)
      env:
        WINDOWS_CERTIFICATE: ${{ fromJSON(inputs.secrets).WINDOWS_SIGNING }}
        WINDOWS_CERTIFICATE_PASSWORD: ${{ fromJSON(inputs.secrets).WINDOWS_SIGNING_PASSWORD }}

    # https://github.com/product-os/scripts/tree/master/shared
    # https://github.com/product-os/balena-concourse/blob/master/pipelines/github-events/template.yml
    - name: Package release
      shell: bash
      run: |
        set -ea

        [[ '${{ inputs.VERBOSE }}' =~ on|On|Yes|yes|true|True ]] && set -x

        runner_os="$(echo "${RUNNER_OS}" | tr '[:upper:]' '[:lower:]')"
        runner_arch="$(echo "${RUNNER_ARCH}" | tr '[:upper:]' '[:lower:]')"

        if [[ $runner_os =~ darwin|macos|osx ]]; then
          CSC_KEY_PASSWORD=${{ fromJSON(inputs.secrets).APPLE_SIGNING_PASSWORD }}
          CSC_KEYCHAIN=signing_temp
          CSC_LINK=${{ fromJSON(inputs.secrets).APPLE_SIGNING }}
        elif [[ $runner_os =~ windows|win ]]; then
          CSC_KEY_PASSWORD=${{ fromJSON(inputs.secrets).WINDOWS_SIGNING_PASSWORD }}
          CSC_LINK='${{ runner.temp }}\certificate.pfx'

          # patches/all/oclif.patch
          MSYSSHELLPATH="$(which bash)"
          MSYSTEM=MSYS

          # (signtool.exe) https://github.com/actions/runner-images/blob/main/images/win/Windows2019-Readme.md#installed-windows-sdks
          PATH="/c/Program Files (x86)/Windows Kits/10/bin/${runner_arch}:${PATH}"
        fi

        npm run package

        find dist -type f -maxdepth 1
      env:
        # https://github.blog/2020-08-03-github-actions-improvements-for-fork-and-pull-request-workflows/#improvements-for-public-repository-forks
        # https://docs.github.com/en/actions/managing-workflow-runs/approving-workflow-runs-from-public-forks#about-workflow-runs-from-public-forks
        CSC_FOR_PULL_REQUEST: true
        # https://sectigo.com/resource-library/time-stamping-server
        TIMESTAMP_SERVER: http://timestamp.sectigo.com
        # Apple notarization (automation/build-bin.ts)
        XCODE_APP_LOADER_EMAIL: ${{ inputs.XCODE_APP_LOADER_EMAIL }}
        XCODE_APP_LOADER_PASSWORD: ${{ fromJSON(inputs.secrets).XCODE_APP_LOADER_PASSWORD }}

    - name: Upload artifacts
      uses: actions/upload-artifact@v3
      with:
        name: gh-release-${{ github.event.pull_request.head.sha || github.event.head_commit.id }}
        path: dist
        retention-days: 1

@@ -1,57 +0,0 @@
---
name: test release
# https://github.com/product-os/flowzone/tree/master/.github/actions
inputs:
  json:
    description: "JSON stringified object containing all the inputs from the calling workflow"
    required: true
  secrets:
    description: "JSON stringified object containing all the secrets from the calling workflow"
    required: true
  # --- custom environment
  NODE_VERSION:
    type: string
    # FIXME: (please) https://github.com/balena-io/balena-cli/issues/2165
    default: "12.x"
  VERBOSE:
    type: string
    default: "true"

runs:
  # https://docs.github.com/en/actions/creating-actions/creating-a-composite-action
  using: "composite"
  steps:
    # https://github.com/actions/setup-node#caching-global-packages-data
    - name: Setup Node.js
      uses: actions/setup-node@v3
      with:
        node-version: ${{ inputs.NODE_VERSION }}
        cache: npm

    - name: Test release
      shell: bash
      run: |
        set -ea

        [[ '${{ inputs.VERBOSE }}' =~ on|On|Yes|yes|true|True ]] && set -x

        if [[ -e package-lock.json ]]; then
          npm ci
        else
          npm i
        fi

        npm run build
        npm run test

    - name: Compress custom source
      shell: pwsh
      run: tar -acf ${{ runner.temp }}/custom.tgz .

    - name: Upload custom artifact
      uses: actions/upload-artifact@v3
      with:
        name: custom-${{ github.event.pull_request.head.sha || github.event.head_commit.id }}-${{ runner.os }}
        path: ${{ runner.temp }}/custom.tgz
        retention-days: 1

@@ -1,16 +0,0 @@
name: Flowzone

on:
  pull_request:
    types: [opened, synchronize, closed]
    branches:
      - "main"
      - "master"

jobs:
  flowzone:
    name: Flowzone
    uses: product-os/flowzone/.github/workflows/flowzone.yml@master
    secrets: inherit
    with:
      tests_run_on: '["ubuntu-20.04","macos-11","windows-2019"]'

.gitignore vendored
@@ -1,35 +1,39 @@
# Reminders:
# * A pattern without '/' matches in subdirectories as well (files and directories)
# * A leading '/' anchors matching to the directory where `.gitignore` is defined
# * A trailing '/' makes the pattern match against directories only
# More details: https://git-scm.com/docs/gitignore
# development and testing tools or IDEs
# Logs
logs
*.log
# Runtime data
pids
*.pid
*.seed
/.idea/
/.lock-wscript
/.nyc_output/
/.vscode/
/coverage/
/lib-cov/
/logs
/pids
# OS cache files
# Directory for instrumented libs generated by jscoverage/JSCover
lib-cov
# Coverage directory used by tools like istanbul
coverage
# node-waf configuration
.lock-wscript
# Compiled binary addons (http://nodejs.org/api/addons.html)
build/Release
# Dependency directory
# Commenting this out is preferred by some people, see
# https://www.npmjs.org/doc/misc/npm-faq.html#should-i-check-my-node_modules-folder-into-git-
node_modules
npm-shrinkwrap.json
package-lock.json
.resinconf
resinrc.yml
.idea
.vscode
.DS_Store
# balena CLI config and build files
.balenaconf
.fast-boot.json
.resinconf
/balenarc.yml
/build/
/build-bin/
/dist/
/node_modules
/oclif.manifest.json
/package-lock.json
/resinrc.yml
/tmp/
/tmp
build/
build-bin/
build-zip/

@@ -1,2 +1,5 @@
coffee_script:
  config_file: coffeelint.json
javascript:
  enabled: false

@@ -1,6 +0,0 @@
const commonConfig = require('./.mocharc.js');

module.exports = {
	...commonConfig,
	spec: ['tests/auth/*.spec.ts', 'tests/commands/**/*.spec.ts'],
};

@@ -1,10 +0,0 @@
module.exports = {
	reporter: 'spec',
	require: 'ts-node/register/transpile-only',
	file: './tests/config-tests',
	timeout: 12000,
	// To test only, say, 'push.spec.ts', do it as follows so that
	// requests are authenticated:
	// spec: ['tests/auth/*.spec.ts', 'tests/**/deploy.spec.ts'],
	spec: 'tests/**/*.spec.ts',
};

.nvmrc
@@ -1 +0,0 @@
12

.travis.yml (new file)
@@ -0,0 +1,19 @@
language: node_js
os:
  - linux
  - osx
node_js:
  - "6"
before_install:
  - npm -g install npm@4
script: npm run ci
notifications:
  email: false
deploy:
  - provider: script
    script: npm run release
    skip_cleanup: true
    on:
      tags: true
      condition: "$TRAVIS_TAG =~ ^v?[[:digit:]]+\\.[[:digit:]]+\\.[[:digit:]]+"
      repo: balena-io/balena-cli

File diff suppressed because it is too large

File diff suppressed because it is too large

@@ -1,302 +0,0 @@
# Contributing
The balena CLI is an open source project and your contribution is welcome!
* Install the dependencies listed in the [NPM Installation
section](./INSTALL-ADVANCED.md#npm-installation) of the installation instructions. Check
the section [Additional Dependencies](./INSTALL-ADVANCED.md#additional-dependencies) too.
* Clone the `balena-cli` repository (or a [forked
repo](https://docs.github.com/en/free-pro-team@latest/github/getting-started-with-github/fork-a-repo),
if you are not in the balena team), `cd` to it and run `npm install`.
* Build the CLI with `npm run build` or `npm test`, and execute it with `./bin/balena`
(on a Windows command prompt, you may need to run `node .\bin\balena`).
In order to ease development:
* `npm run build:fast` skips some of the build steps for interactive testing, or
* `npm run test:source` skips testing the standalone zip packages (which is rather slow)
* `./bin/balena-dev` uses `ts-node/register` to transpile on the fly.
Before opening a PR, test your changes with `npm test`. Keep compatibility in mind, as the CLI is
meant to run on Linux, macOS and Windows. balena CI will run test code on all three platforms, but
this will only help if you add some test cases for your new code!
## Semantic versioning, commit messages and the ChangeLog
When a pull request is merged, Balena's versionbot / Continuous Integration system takes care of
automatically creating a new CLI release on both the [npm
registry](https://www.npmjs.com/package/balena-cli) and the GitHub [releases
page](https://github.com/balena-io/balena-cli/releases). The release version numbering adheres to
the [Semantic Versioning's](http://semver.org/) concept of patch, minor and major releases.
Generally, bug fixes and documentation changes are classed as patch changes, while new features are
classed as minor changes. If a change breaks backwards compatibility, it is a major change.
A new version entry is also automatically added to the
[CHANGELOG.md](https://github.com/balena-io/balena-cli/blob/master/CHANGELOG.md) file when a pull
request is merged. Each pull request corresponds to a single version / release. Each commit in the
pull request becomes a bullet point entry in the Changelog. The Changelog file should not be
manually edited.
To support this automation, a commit message should be structured as follows:
```text
The first line becomes a bullet point in the CHANGELOG file
Optionally, a more detailed description in one or more paragraphs.
The detailed description can be seen with `git log`, but it is not copied
to the CHANGELOG file.
Change-type: patch|minor|major
```
Only the first line of the commit message is copied to the Changelog file. The `Change-type` footer
must be preceded by a blank line, and indicates the commit's semver change type. When a PR consists
of multiple commits, the commits may have different change type values. As a whole, the PR will
produce a release of the "highest" change type. For example, two commits mixing patch and minor
change types will produce a minor CLI release, while two commits mixing minor and major change
types will produce a major CLI release.
The commit message is parsed / checked by versionbot with the
[resin-commit-lint](https://github.com/balena-io-modules/resin-commit-lint#resin-commit-lint)
package.
Because of the way that the Changelog file is automatically updated from commit messages, which
become the source of "what's new" for CLI end users, we advocate "meaningful commits" and
user-focused commit messages. A meaningful commit is one that, in isolation, introduces a fix or
feature (or part of a fix or feature) that makes sense at the Changelog level, and which leaves the
CLI in a non-broken state. Sometimes, in the course of preparing a single pull request, a developer
creates several commits as a way of saving their "work in progress", which may even fail to build
(e.g. `npm run build` fails), and which is then fixed or undone by further commits in the same PR.
In this situation, the recommendation is to "squash" or "fixup" the work-in-progress commits into
fewer, meaningful commits. Interactive rebase is a good tool to achieve this:
[blog](https://thoughtbot.com/blog/git-interactive-rebase-squash-amend-rewriting-history),
[docs](https://git-scm.com/book/en/v2/Git-Tools-Rewriting-History).
Mixing multiple distinct features or bug fixes in a single commit is discouraged, because the
description will likely not fit in the single-line Changelog bullet point and also because it
makes it harder to review the pull request (especially a large one) and harder to isolate and
revert individual changes in case a bug is found later on. Create a separate commit for each
feature / bug fix, or even separate pull requests.
If you need to catch up with changes to the master branch while working on a pull request,
use rebase instead of merge: [docs](https://git-scm.com/book/en/v2/Git-Branching-Rebasing).
If `package.json` is updated for dependencies listed in the `repo.yml` file (like `balena-sdk`),
the commit message body should also include a line in the following format:
```
Update balena-sdk from 12.0.0 to 12.1.0
```
This allows versionbot to produce nested Changelog entries (with expandable arrows), pulling in
commit messages from the upstream repositories. The following npm script can be used to
automatically produce a commit with a suitable commit message:
```
npm run update balena-sdk ^12.1.0
```
The script will create a new branch (only if `master` is currently checked out), run `npm update`
with the given target version and commit the `package.json` and `npm-shrinkwrap.json` files. The
script by default will set the `Change-type` to `patch` or `minor`, depending on the semver change
of the updated dependency. A `major` change type can be specified as an extra argument:
```
npm run update balena-sdk ^12.14.0 patch
npm run update balena-sdk ^13.0.0 major
```
## Editing documentation files (README, INSTALL, Reference website...)
The `docs/balena-cli.md` file is automatically generated by running `npm run build:doc` (which also
runs as part of `npm run build`). That file is then pulled by scripts in the
[balena-io/docs](https://github.com/balena-io/docs/) GitHub repo for publishing at the [CLI
Documentation page](https://www.balena.io/docs/reference/cli/).
The content sources for the auto generation of `docs/balena-cli.md` are:
* [Selected
sections](https://github.com/balena-io/balena-cli/blob/v12.23.0/automation/capitanodoc/capitanodoc.ts#L199-L204)
of the README file.
* The CLI's command documentation in source code (`lib/commands/` folder), for example:
* `lib/commands/push.ts`
* `lib/commands/env/add.ts`
The README file is manually edited, but subsections are automatically extracted for inclusion in
`docs/balena-cli.md` by the `getCapitanoDoc()` function in
[`automation/capitanodoc/capitanodoc.ts`](https://github.com/balena-io/balena-cli/blob/master/automation/capitanodoc/capitanodoc.ts).
The `INSTALL*.md` and `TROUBLESHOOTING.md` files are also manually edited.
## Patches folder
The `patches` folder contains patch files created with the
[patch-package](https://www.npmjs.com/package/patch-package) tool. Small code changes to
third-party modules can be made by directly editing Javascript files under the `node_modules`
folder and then running `patch-package` to create the patch files. The patch files are then
applied immediately after `npm install`, through the `postinstall` script defined in
`package.json`.
The subfolders of the `patches` folder are documented in the
[apply-patches.js](https://github.com/balena-io/balena-cli/blob/master/patches/apply-patches.js)
script.
To make changes to the patch files under the `patches` folder, **do not edit them directly,**
not even for a "single character change" because the hash values in the patch files also need
to be recomputed by `patch-package`. Instead, edit the relevant files under `node_modules`
directly, and then run `patch-package` with the `--patch-dir` option to specify the subfolder
where the patch should be saved. For example, edit `node_modules/exit-hook/index.js` and then
run:
```sh
$ npx patch-package --patch-dir patches/all exit-hook
```
That said, these kinds of patches should be avoided in favour of creating pull requests
upstream. Patch files create additional maintenance work over time as the patches need to be
updated when the dependencies are updated, and they prevent the compounding community benefit
that sharing fixes upstream has on open source projects like the balena CLI. The typical
scenario where these patches are used is when the upstream maintainers are unresponsive or
unwilling to merge the required fixes, the fixes are very small and specific to the balena CLI,
and creating a fork of the upstream repo is likely to be more long-term effort than maintaining
the patches.
## Windows
Besides the regular npm installation dependencies, the `npm run build:installer` script
that produces the `.exe` graphical installer on Windows also requires
[NSIS](https://sourceforge.net/projects/nsis/) and [MSYS2](https://www.msys2.org/) to be
installed. Be sure to add `C:\Program Files (x86)\NSIS` to the PATH, so that `makensis`
is available. MSYS2 is recommended when developing the balena CLI on Windows.
If changes are made to npm scripts in `package.json`, don't assume that a Unix shell like
bash is available. For example, some Windows shells don't have the `cp` and `rm` commands,
which is why you'll often find `ncp` and `rimraf` used in `package.json` scripts.
## Updating the 'npm-shrinkwrap.json' file
The `npm-shrinkwrap.json` file is used to control package dependencies, as documented at
https://docs.npmjs.com/files/shrinkwrap.json.
Changes to `npm-shrinkwrap.json` can be automatically merged by git during operations like
`rebase`, `pull` and `cherry-pick`, but in some cases this results in suboptimal dependency
resolution (the `node_modules` folder may end up larger than necessary, with consequences to CLI
load time too). For this reason, the recommended way to update `npm-shrinkwrap.json` is to run
`npm install`, possibly alongside `npm dedupe` as well. The following commands can be used to
fix shrinkwrap issues and optimize the dependencies:
```sh
git checkout master -- npm-shrinkwrap.json
rm -rf node_modules
npm install # update npm-shrinkwrap.json to satisfy changes to package.json
npm dedupe # deduplicate dependencies from npm-shrinkwrap.json
npm install # re-add optional dependencies removed by dedupe
git add npm-shrinkwrap.json # add it for committing (solve merge errors)
```
Note that `npm dedupe` should always be followed by `npm install`, as shown above, even if
`npm install` had already been executed before `npm dedupe`.
Optionally, these steps may be automated by installing the
[npm-merge-driver](https://www.npmjs.com/package/npm-merge-driver):
```sh
npx npm-merge-driver install -g
```
## `fast-boot` and `npm link` - modifying the `node_modules` folder
During development or debugging, it is sometimes useful to temporarily modify the `node_modules`
folder (with or without making the respective changes to the `npm-shrinkwrap.json` file),
replacing dependencies with different versions. This can be achieved with the `npm link`
command, or by manually editing or copying files to the `node_modules` folder.
Unexpected behavior may then be observed because of the CLI's use of the
[fast-boot2](https://www.npmjs.com/package/fast-boot2) package that caches module resolution.
`fast-boot2` is configured in `lib/fast-boot.ts` to automatically invalidate the cache if
changes are made to the `package.json` or `npm-shrinkwrap.json` files, but the cache won't
be automatically invalidated if `npm link` is used or if manual modifications are made to the
`node_modules` folder. In this situation:
* Manually delete the module cache file (typically `~/.balena/cli-module-cache.json`), or
* Use the `bin/balena-dev` entry point (instead of `bin/balena`) as it does not activate
`fast-boot2`.
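As a cross-platform alternative to deleting the file by hand, a throwaway Node.js snippet like the one below can be used. This is only a sketch: the path is the typical location mentioned above, not a guaranteed one, and should be adjusted if your balena data directory is not the default `~/.balena`.

```js
const fs = require('fs');
const os = require('os');
const path = require('path');

// Typical location of the fast-boot2 module cache (see note above);
// adjust if your balena data directory is configured differently.
const cacheFile = path.join(os.homedir(), '.balena', 'cli-module-cache.json');

if (fs.existsSync(cacheFile)) {
	fs.unlinkSync(cacheFile);
	console.log(`Deleted ${cacheFile}`);
}
```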
## TypeScript and oclif
The CLI currently contains a mix of plain JavaScript and
[TypeScript](https://www.typescriptlang.org/) code. The goal is to have all code written in
Typescript, in order to take advantage of static typing and formal programming interfaces.
The migration towards Typescript is taking place gradually, as part of maintenance work or
the implementation of new features.
Of historical interest, the CLI was originally written in [CoffeeScript](https://coffeescript.org)
and used the [Capitano](https://github.com/balena-io/capitano) framework. All CoffeeScript code was
migrated to either Javascript or Typescript, and Capitano was replaced with oclif. A few file or
variable names still refer to this legacy, for example `automation/capitanodoc/capitanodoc.ts`.
## Programming style
`npm run build` also runs [balena-lint](https://www.npmjs.com/package/@balena/lint), which automatically
reformats the code. Beyond that, we have a preference for Javascript promises over callbacks, and for
`async/await` over `.then()`.
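As a minimal illustration of this preference (the function names below are made up and are not taken from the CLI codebase):

```js
const { promises: fs } = require('fs');

// Works, but .then() chains get harder to follow as logic grows:
function readConfigWithThen(configPath) {
	return fs.readFile(configPath, 'utf8').then((text) => JSON.parse(text));
}

// Preferred: async/await reads top to bottom and plays well with try/catch:
async function readConfig(configPath) {
	const text = await fs.readFile(configPath, 'utf8');
	return JSON.parse(text);
}
```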
## Common gotchas
One thing that most CLI bugs have in common is the absence of test cases exercising the broken
code, so writing some test code is a great idea. Having said that, there are also some common
gotchas to bear in mind:
* Forward slashes ('/') _vs._ backslashes ('\') in file paths. The Node.js
[path.sep](https://nodejs.org/docs/latest-v12.x/api/path.html#path_path_sep) variable stores a
platform-specific path separator character: the backslash on Windows and the forward slash on
Linux and macOS. The
[path.join](https://nodejs.org/docs/latest-v12.x/api/path.html#path_path_join_paths) function
builds paths using such platform-specific path separator. However:
* Note that Windows (kernel, cmd.exe, PowerShell, many applications) accepts ***both*** forward
slashes and backslashes as path separators (including mixing them in a path string), so code
like `mypath.split(path.sep)` may fail on Windows if `mypath` contains forward slashes. The
[path.parse](https://nodejs.org/docs/latest-v12.x/api/path.html#path_path_parse_path) function
understands both forward slashes and backslashes on Windows, and the
[path.normalize](https://nodejs.org/docs/latest-v12.x/api/path.html#path_path_normalize_path)
function will _replace_ forward slashes with backslashes.
* In [tar](https://en.wikipedia.org/wiki/Tar_(computing)#File_format) streams sent to the Docker
daemon and to balenaCloud, the forward slash is the only acceptable path separator, regardless
of the OS where the CLI is running. Therefore, `path.sep` and `path.join` should never be used
when handling paths in tar streams! `path.posix.join` may be used instead of `path.join`.
* Avoid using the system shell to execute external commands, for example:
`child_process.exec('ssh "arg1" "arg2"');`
`child_process.spawn('ssh "arg1" "arg2"', { shell: true });`
Besides the usual security concerns of unsanitized strings, another problem is to get argument
escaping right because of the differences between the Windows 'cmd.exe' shell and the Unix
'/bin/sh'. For example, 'cmd.exe' doesn't recognize single quotes like '/bin/sh', and uses the
caret (^) instead of the backslash as the escape character. Bug territory! Most of the time,
it is possible to avoid relying on the shell altogether by providing a Javascript array of
arguments:
`spawn('ssh', ['arg1', 'arg2'], { shell: false});`
To allow for logging and debugging, the [which](https://www.npmjs.com/package/which) package may
be used to get the full path of a command before executing it, without relying on any shell:
`const fullPath = await which('ssh');`
`console.log(fullPath); // 'C:\WINDOWS\System32\OpenSSH\ssh.EXE'`
`spawn(fullPath, ['arg1', 'arg2'], { shell: false });`
A fuller sketch is included just after this list.
* Avoid the `instanceof` operator when testing against classes/types from external packages
(including base classes), because `npm install` may result in multiple versions of the same
package being installed (to satisfy declared dependencies) and a false negative may result when
comparing an object instance from one package version with a class of another package version
(even if the implementations are identical in both packages). For example, once we fixed a bug
where the test:
`error instanceof BalenaApplicationNotFound`
changed from true to false because `npm install` added an additional copy of the `balena-errors`
package to satisfy a minor `balena-sdk` version update:
`$ find node_modules -name balena-errors`
`node_modules/balena-errors`
`node_modules/balena-sdk/node_modules/balena-errors`
In the case of subclasses of `TypedError`, a string comparison may be used instead:
`error.name === 'BalenaApplicationNotFound'`
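The gotchas above can be tied together in a small, hypothetical sketch. None of these helpers exist in the CLI codebase; they only illustrate the points about tar paths, shell-less spawning and error names:

```js
const path = require('path');
const { spawn } = require('child_process');
const which = require('which');

// Tar streams sent to the Docker daemon or balenaCloud always use forward
// slashes, so build entry names with path.posix.join rather than path.join:
const tarEntryName = path.posix.join('src', 'app.js'); // 'src/app.js' on every OS

// Spawn ssh without a shell: arguments are passed as an array, so neither
// cmd.exe nor /bin/sh quoting rules get a chance to mangle them.
async function runSsh(args) {
	const fullPath = await which('ssh'); // e.g. 'C:\WINDOWS\System32\OpenSSH\ssh.EXE'
	return new Promise((resolve, reject) => {
		const child = spawn(fullPath, args, { shell: false, stdio: 'inherit' });
		child.on('error', reject);
		child.on('close', (code) =>
			code === 0 ? resolve() : reject(new Error(`ssh exited with code ${code}`)),
		);
	});
}

// Compare errors by name rather than with instanceof, which keeps working
// even if npm installs more than one copy of the package defining the class:
function isAppNotFound(error) {
	return error.name === 'BalenaApplicationNotFound';
}
```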
## Further debugging notes
* If you need to selectively run specific tests, `it.only` will not work in cases when authorization is required as part of the test cycle. In order to target specific tests, control execution via `.mocharc.js` instead. Here is an example of targeting the `deploy` tests.
replace: `spec: 'tests/**/*.spec.ts',`
with: `spec: ['tests/auth/*.spec.ts', 'tests/**/deploy.spec.ts'],`
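For instance, a temporary, not-to-be-committed edit of `.mocharc.js` (the version of that file shown earlier in this diff) targeting only the `deploy` tests might look like this:

```js
module.exports = {
	reporter: 'spec',
	require: 'ts-node/register/transpile-only',
	file: './tests/config-tests',
	timeout: 12000,
	// Run the auth specs first so that subsequent requests are authenticated:
	spec: ['tests/auth/*.spec.ts', 'tests/**/deploy.spec.ts'],
};
```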

@@ -1,169 +0,0 @@
# balena CLI Advanced Installation Options
**These are alternative, advanced installation options. Most users would prefer the [recommended,
streamlined installation
instructions](https://github.com/balena-io/balena-cli/blob/master/INSTALL.md).**
There are 3 options to choose from to install balena's CLI:
* [Executable Installer](#executable-installer): the easiest method on Windows and macOS, using the
traditional graphical desktop application installers.
* [Standalone Zip Package](#standalone-zip-package): these are plain zip files with the balena CLI
executable in them: extract and run. Available for all platforms: Linux, Windows, macOS.
Recommended also for scripted installation in CI (continuous integration) environments.
* [NPM Installation](#npm-installation): recommended for Node.js developers who may be interested
in integrating the balena CLI in their existing projects or workflow.
Some specific CLI commands have a few extra installation steps: see section [Additional
Dependencies](#additional-dependencies).
## Executable Installer
This is the recommended installation option on macOS and Windows. Follow the specific OS
instructions:
* [Windows](./INSTALL-WINDOWS.md)
* [macOS](./INSTALL-MAC.md)
> Note regarding WSL ([Windows Subsystem for
> Linux](https://docs.microsoft.com/en-us/windows/wsl/about))
> If you would like to use WSL, follow the [installation instructions for
> Linux](./INSTALL-LINUX.md) rather than Windows, as WSL consists of a Linux environment.
If you had previously installed the CLI using a standalone zip package, it may be a good idea to
check your system's `PATH` environment variable for duplicate entries, as the terminal will use the
entry that comes first. Check the [Standalone Zip Package](#standalone-zip-package) instructions
for how to modify the PATH variable.
By default, the CLI is installed to the following folders:
OS | Folders
--- | ---
Windows: | `C:\Program Files\balena-cli\`
macOS: | `/usr/local/lib/balena-cli/` <br> `/usr/local/bin/balena`
## Standalone Zip Package
1. Download the latest zip file from the [releases page](https://github.com/balena-io/balena-cli/releases).
Look for a file name that ends with the word "standalone", for example:
`balena-cli-vX.Y.Z-linux-x64-standalone.zip` (_also for the Windows Subsystem for Linux_)
`balena-cli-vX.Y.Z-macOS-x64-standalone.zip`
`balena-cli-vX.Y.Z-windows-x64-standalone.zip`
2. Extract the zip file contents to any folder you choose. The extracted contents will include a
`balena-cli` folder.
3. Add the `balena-cli` folder to the system's `PATH` environment variable.
See instructions for:
[Linux](https://stackoverflow.com/questions/14637979/how-to-permanently-set-path-on-linux-unix) |
[macOS](https://www.architectryan.com/2012/10/02/add-to-the-path-on-mac-os-x-mountain-lion/#.Uydjga1dXDg) |
[Windows](https://www.computerhope.com/issues/ch000549.htm)
> * If you are using macOS 10.15 or later (Catalina, Big Sur), [check this known issue and
> workaround](https://github.com/balena-io/balena-cli/issues/2244).
> * **Linux Alpine** and **Busybox:** the standalone zip package is not currently compatible with
> these "compact" Linux distributions, because of the alternative C libraries they ship with.
> For these, consider the [NPM Installation](#npm-installation) option.
> * Note that moving the `balena` executable out of the extracted `balena-cli` folder on its own
> (e.g. moving it to `/usr/local/bin/balena`) will **not** work, as it depends on the other
> folders and files also present in the `balena-cli` folder.
To update the CLI to a new version, download a new release zip file and replace the previous
installation folder. To uninstall, simply delete the folder and edit the PATH environment variable
as described above.
## NPM Installation
If you are a Node.js developer, you may wish to install the balena CLI via [npm](https://www.npmjs.com).
The npm installation involves building native (platform-specific) binary modules, which require
some development tools to be installed first, as follows.
> **The balena CLI currently requires Node.js version 12 (min 12.8.0).**
> **Versions 13 and later are not yet fully supported.**
### Install development tools
#### **Linux or WSL** (Windows Subsystem for Linux)
```sh
$ sudo apt-get update && sudo apt-get -y install curl python3 git make g++
$ curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.38.0/install.sh | bash
$ . ~/.bashrc
$ nvm install 12
```
The `curl` command line above uses
[nvm](https://github.com/nvm-sh/nvm/blob/master/README.md#install--update-script) to install
Node.js, instead of using `apt-get`. Installing Node.js through `apt-get` is a common source of
problems from permission errors to conflict with other system packages, and therefore not
recommended.
#### **macOS**
* Download and install Apple's Command Line Tools from https://developer.apple.com/downloads/
* Install Node.js through [nvm](https://github.com/nvm-sh/nvm/blob/master/README.md#install--update-script):
```sh
$ curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.38.0/install.sh | bash
$ . ~/.bashrc
$ nvm install 12
```
#### **Windows** (not WSL)
Install:
* Node.js v12 from the [Nodejs.org releases page](https://nodejs.org/en/download/releases/).
* If you'd like the ability to switch between Node.js versions, install
[nvm-windows](https://github.com/coreybutler/nvm-windows#node-version-manager-nvm-for-windows)
instead.
* The [MSYS2 shell](https://www.msys2.org/), which provides `git`, `make`, `g++` and more:
* `pacman -S git gcc make openssh p7zip`
* [Set a Windows environment variable](https://www.onmsft.com/how-to/how-to-set-an-environment-variable-in-windows-10): `MSYS2_PATH_TYPE=inherit`
* Note that a bug in the MSYS2 launch script (`msys2_shell.cmd`) makes text-based
interactive CLI menus misbehave. [Check this GitHub issue for a
workaround](https://github.com/msys2/MINGW-packages/issues/1633#issuecomment-240583890).
* The Windows Driver Kit (WDK), which is needed to compile some native Node modules. It is **not**
necessary to install Visual Studio, only the WDK, which is "step 2" in the following guides:
* [WDK for Windows 10](https://docs.microsoft.com/en-us/windows-hardware/drivers/download-the-wdk#download-icon-step-2-install-refreshed-wdk-for-windows-10-version-2004)
* [WDK for earlier versions of Windows](https://docs.microsoft.com/en-us/windows-hardware/drivers/other-wdk-downloads#step-2-install-the-wdk)
* The [windows-build-tools](https://www.npmjs.com/package/windows-build-tools) npm package,
by running the following command on an [administrator
console](https://www.howtogeek.com/194041/how-to-open-the-command-prompt-as-administrator-in-windows-8.1/):
`npm install --global --production windows-build-tools`
### Install the balena CLI
After installing the development tools, install the balena CLI with:
```sh
$ npm install balena-cli --global --production --unsafe-perm
```
`--unsafe-perm` is needed when `npm install` is executed as the `root` user (e.g. in a Docker
container) in order to allow npm scripts like `postinstall` to be executed.
## Additional Dependencies
The `balena ssh`, `scan`, `build`, `deploy` and `preload` commands may require
additional software to be installed. Check the Additional Dependencies sections for each operating
system:
* [Windows](./INSTALL-WINDOWS.md#additional-dependencies)
* [macOS](./INSTALL-MAC.md#additional-dependencies)
* [Linux](./INSTALL-LINUX.md#additional-dependencies)
Where Docker or balenaEngine are required, they may be installed on the local machine (where the
balena CLI is executed), on a remote server, or on a balenaOS device running a [balenaOS development
image](https://www.balena.io/docs/reference/OS/overview/2.x/#dev-vs-prod-images). Reasons why this
may be desirable include:
* To avoid having to install Docker on the development machine / laptop.
* To take advantage of a more powerful server (CPU, memory).
* To build or run images "natively" on an ARM device, avoiding the need for QEMU emulation.
To use a remote Docker Engine (daemon) or balenaEngine, specify the remote machine's IP address and
port number with the `--dockerHost` and `--dockerPort` command-line options. The `preload` command
has additional requirements because the bind mount feature is used. For more details, see
`balena help` for each command or the [online
reference](https://www.balena.io/docs/reference/cli/#cli-command-reference).

@@ -1,85 +0,0 @@
# balena CLI Installation Instructions for Linux
These instructions are suitable for most Linux distributions on Intel x86, such as
Ubuntu, Debian, Fedora, Arch Linux and other glibc-based distributions.
For the ARM architecture and for Linux distributions not based on glibc, such as
Alpine Linux, follow the [NPM Installation](./INSTALL-ADVANCED.md#npm-installation)
method.
Selected operating system: **Linux**
1. Download the latest zip file from the [latest release
page](https://github.com/balena-io/balena-cli/releases/latest). Look for a file name that ends
with "-standalone.zip", for example:
`balena-cli-vX.Y.Z-linux-x64-standalone.zip`
2. Extract the zip file contents to any folder you choose, for example `/home/james`.
The extracted contents will include a `balena-cli` folder.
3. Add that folder (e.g. `/home/james/balena-cli`) to the `PATH` environment variable.
Check this [StackOverflow
post](https://stackoverflow.com/questions/14637979/how-to-permanently-set-path-on-linux-unix)
for instructions. Close and reopen the terminal window so that the changes to `PATH`
can take effect.
4. Check that the installation was successful by running the following commands on a
terminal window:
* `balena version` - should print the CLI's version
* `balena help` - should print a list of available commands
To update the balena CLI to a new version, download a new release zip file and replace the previous
installation folder. To uninstall, simply delete the folder and edit the PATH environment variable
as described above.
## sudo configuration
A few CLI commands require execution through sudo, e.g. `sudo balena scan`.
If your Linux distribution has an `/etc/sudoers` file that defines a `secure_path`
setting, run `sudo visudo` to edit it and add the balena CLI's installation folder to
the ***pre-existing*** `secure_path` setting, for example:
```text
Defaults secure_path="/home/james/balena-cli:<pre-existing entries go here>"
```
If an `/etc/sudoers` file does not exist, or if it does not contain a pre-existing
`secure_path` setting, do not change it.
If you also have Docker installed, ensure that it can be executed ***without*** `sudo`, so that
CLI commands like `balena build` and `balena preload` can also be executed without `sudo`.
Check Docker's [post-installation
steps](https://docs.docker.com/engine/install/linux-postinstall/) on how to achieve this.
## Additional Dependencies
### build, deploy
These commands require [Docker](https://docs.docker.com/install/overview/) or
[balenaEngine](https://www.balena.io/engine/) to be available on a local or remote
machine. Most users will follow [Docker's installation
instructions](https://docs.docker.com/install/overview/) to install Docker on the same
workstation as the balena CLI. The [advanced installation
options](./INSTALL-ADVANCED.md#additional-dependencies) document describes other possibilities.
### balena ssh
The `balena ssh` command requires the `ssh` command-line tool to be available. Most Linux
distributions will already have it installed. Otherwise, `sudo apt-get install openssh-client`
should do the trick on Debian or Ubuntu.
The `balena ssh` command also requires an SSH key to be added to your balena account: see [SSH
Access documentation](https://www.balena.io/docs/learn/manage/ssh-access/). The `balena key*`
command set can also be used to list and manage SSH keys: see `balena help -v`.
### balena scan
The `balena scan` command requires a multicast DNS (mDNS) service like
[Avahi](https://en.wikipedia.org/wiki/Avahi_(software)), which is installed by default on most
desktop Linux distributions. Otherwise, on Debian or Ubuntu, the installation command would be
`sudo apt-get install avahi-daemon`.
### balena preload
Like the `build` and `deploy` commands, the `preload` command requires Docker, with the additional
restriction that Docker must be installed on the local machine (because Docker's bind mounting
feature is used).

@@ -1,74 +0,0 @@
# balena CLI Installation Instructions for macOS
These instructions are for the recommended installation option. Advanced users may also be
interested in [advanced installation options](./INSTALL-ADVANCED.md).
Selected operating system: **macOS**
1. Download the installer from the [latest release
page](https://github.com/balena-io/balena-cli/releases/latest).
Look for a file name that ends with "-installer.pkg":
`balena-cli-vX.Y.Z-macOS-x64-installer.pkg`
2. Double click on the downloaded file to run the installer and follow the installer's
instructions.
3. Check that the installation was successful:
- [Open the Terminal
app](https://support.apple.com/en-gb/guide/terminal/apd5265185d-f365-44cb-8b09-71a064a42125/mac).
- On the terminal prompt, type `balena version` and hit Enter. It should display
the version of the balena CLI that you have installed.
No further steps are required to run most CLI commands. The `balena ssh`, `build`, `deploy`
and `preload` commands may require additional software to be installed, as described
in the next section.
To update the balena CLI, repeat the steps above for the new version.
To uninstall it, run the following command on a terminal prompt:
```text
sudo /usr/local/lib/balena-cli/bin/uninstall
```
## Additional Dependencies
### build and deploy
These commands require [Docker](https://docs.docker.com/install/overview/) or
[balenaEngine](https://www.balena.io/engine/) to be available on a local or remote
machine. Most users will follow [Docker's installation
instructions](https://docs.docker.com/install/overview/) to install Docker on the same
workstation as the balena CLI. The [advanced installation
options](./INSTALL-ADVANCED.md#additional-dependencies) document describes other possibilities.
### balena ssh
The `balena ssh` command requires the `ssh` command-line tool to be available. To check whether
it is already installed, run `ssh` on a Terminal window. If it is not yet installed, the options
include:
* Download the Xcode Command Line Tools from https://developer.apple.com/downloads
* Or, if you have Xcode installed, open Xcode, choose Preferences → General → Downloads →
Components → Command Line Tools → Install.
* Or, install [Homebrew](https://brew.sh/), then `brew install openssh`
The `balena ssh` command also requires an SSH key to be added to your balena account: see [SSH
Access documentation](https://www.balena.io/docs/learn/manage/ssh-access/). The `balena key*`
command set can also be used to list and manage SSH keys: see `balena help -v`.
### balena preload
Like the `build` and `deploy` commands, the `preload` command requires Docker.
Preloading balenaOS images for some older device types (like the Raspberry
Pi 3, but not the Raspberry Pi 4) requires Docker to support the [AUFS storage
driver](https://docs.docker.com/storage/storagedriver/aufs-driver/). Unfortunately, Docker Desktop
for Windows and macOS dropped support for the AUFS filesystem in Docker CE versions greater than
18.06.1. The present workarounds are to either:
* Install the balena CLI on Linux (e.g. Ubuntu) with a virtual machine like VirtualBox.
This works because Docker for Linux still supports AUFS. Hint: if using a virtual machine,
copy the image file over, rather than accessing it through "file sharing", to avoid errors.
* Downgrade Docker Desktop to version 18.06.1. Link: [Docker CE for
Mac](https://docs.docker.com/docker-for-mac/release-notes/#docker-community-edition-18061-ce-mac73-2018-08-29)
We are working on replacing AUFS with overlay2 in balenaOS images of the affected device types.

@@ -1,73 +0,0 @@
# balena CLI Installation Instructions for Windows
These instructions are for the recommended installation option. Advanced users may also be
interested in [advanced installation options](./INSTALL-ADVANCED.md).
Selected operating system: **Windows**
1. Download the installer from the [latest release
page](https://github.com/balena-io/balena-cli/releases/latest).
Look for a file name that ends with "-installer.exe":
`balena-cli-vX.Y.Z-windows-x64-installer.exe`
2. Double click on the downloaded file to run the installer and follow the installer's
instructions.
3. Check that the installation was successful:
- Click on the Windows Start Menu, type PowerShell, and then click
on Windows PowerShell.
- On the command prompt, type `balena version` and hit Enter. It should display
the version of the balena CLI that you have installed.
No further steps are required to run most CLI commands. The `balena ssh`, `scan`, `build`,
`deploy` and `preload` commands may require additional software to be installed, as
described below.
## Additional Dependencies
### build and deploy
These commands require [Docker](https://docs.docker.com/install/overview/) or
[balenaEngine](https://www.balena.io/engine/) to be available on a local or remote
machine. Most users will follow [Docker's installation
instructions](https://docs.docker.com/install/overview/) to install Docker on the same
workstation as the balena CLI. The [advanced installation
options](./INSTALL-ADVANCED.md#additional-dependencies) document describes other possibilities.
### balena ssh
The `balena ssh` command requires the `ssh` command-line tool to be available. Microsoft started
distributing an SSH client with Windows 10, which is automatically installed through Windows
Update. To check whether it is installed, run `ssh` on a Windows Command Prompt or PowerShell. It
can also be [manually
installed](https://docs.microsoft.com/en-us/windows-server/administration/openssh/openssh_install_firstuse)
if needed. For older versions of Windows, there are several ssh/OpenSSH clients provided by 3rd
parties.
The `balena ssh` command also requires an SSH key to be added to your balena account: see [SSH
Access documentation](https://www.balena.io/docs/learn/manage/ssh-access/). The `balena key*`
command set can also be used to list and manage SSH keys: see `balena help -v`.
### balena scan
The `balena scan` command requires a multicast DNS (mDNS) service like Apple's Bonjour.
Many Windows machines will already have this service installed, as it is bundled in popular
applications such as Skype (Wikipedia lists [several others](https://en.wikipedia.org/wiki/Bonjour_(software))).
Otherwise, Bonjour for Windows can be downloaded and installed from: https://support.apple.com/kb/DL999
### balena preload
Like the `build` and `deploy` commands, the `preload` command requires Docker.
Preloading balenaOS images for some older device types (like the Raspberry
Pi 3, but not the Raspberry Pi 4) requires Docker to support the [AUFS storage
driver](https://docs.docker.com/storage/storagedriver/aufs-driver/). Unfortunately, Docker Desktop
for Windows and macOS dropped support for the AUFS filesystem in Docker CE versions greater than
18.06.1. The present workarounds are to either:
* Install the balena CLI on Linux (e.g. Ubuntu) with a virtual machine like VirtualBox.
This works because Docker for Linux still supports AUFS. Hint: if using a virtual machine,
copy the image file over, rather than accessing it through "file sharing", to avoid errors.
* Downgrade Docker Desktop to version 18.06.1. Link: [Docker CE for
Windows](https://docs.docker.com/docker-for-windows/release-notes/#docker-community-edition-18061-ce-win73-2018-08-29)
We are working on replacing AUFS with overlay2 in balenaOS images of the affected device types.

@@ -1,12 +0,0 @@
# balena CLI Installation Instructions
Please select your operating system:
* [Windows](./INSTALL-WINDOWS.md)
* [macOS](./INSTALL-MAC.md)
* [Linux](./INSTALL-LINUX.md)
> Note regarding WSL ([Windows Subsystem for
> Linux](https://docs.microsoft.com/en-us/windows/wsl/about))
> If you would like to use WSL, follow the installation instructions for Linux
> rather than Windows, as WSL consists of a Linux environment.

README.md
@@ -1,181 +1,126 @@
# balena CLI
Resin CLI
=========
The official balena Command Line Interface.
> The official resin.io CLI tool.
[![npm version](https://badge.fury.io/js/balena-cli.svg)](http://badge.fury.io/js/balena-cli)
[![dependencies](https://david-dm.org/balena-io/balena-cli.svg)](https://david-dm.org/balena-io/balena-cli)
[![npm version](https://badge.fury.io/js/resin-cli.svg)](http://badge.fury.io/js/resin-cli)
[![dependencies](https://david-dm.org/resin-io/resin-cli.svg)](https://david-dm.org/resin-io/resin-cli)
[![Gitter](https://badges.gitter.im/Join%20Chat.svg)](https://gitter.im/resin-io/chat)
## About
Requisites
----------
The balena CLI is a Command Line Interface for [balenaCloud](https://www.balena.io/cloud/) or
[openBalena](https://www.balena.io/open/). It is a software tool available for Windows, macOS and
Linux, used through a command prompt / terminal window. It can be used interactively or invoked in
scripts. The balena CLI builds on the [balena API](https://www.balena.io/docs/reference/api/overview/)
and the [balena SDK](https://www.balena.io/docs/reference/sdk/node-sdk/), and can also be directly
imported in Node.js applications. The balena CLI is an [open-source project on
GitHub](https://github.com/balena-io/balena-cli/), and your contribution is also welcome!
If you want to install the CLI directly through npm, you'll need the below. If this looks difficult,
we do now have an experimental standalone binary release available, see ['Standalone install'](#standalone-install) below.
## Installation
- [NodeJS](https://nodejs.org) (>= v6)
- [Git](https://git-scm.com)
- The following executables should be correctly installed in your shell environment:
- `ssh`: Any recent version of the OpenSSH ssh client (required by `resin sync` and `resin ssh`)
- if you need `ssh` to work behind the proxy you also need [`proxytunnel`](http://proxytunnel.sourceforge.net/) installed (available as `proxytunnel` package for Ubuntu, for example)
- `rsync`: >= 2.6.9 (required by `resin sync`)
Check the [balena CLI installation instructions on
GitHub](https://github.com/balena-io/balena-cli/blob/master/INSTALL.md).
##### Windows Support
## Choosing a shell (command prompt/terminal)
Before installing resin-cli, you'll need a working node-gyp environment. If you don't already have one you'll see native module build errors during installation. To fix this, run `npm install -g --production windows-build-tools` in an administrator console (available as 'Command Prompt (Admin)' when pressing windows+x in Windows 7+).
On **Windows,** the standard Command Prompt (`cmd.exe`) and
[PowerShell](https://docs.microsoft.com/en-us/powershell/scripting/getting-started/getting-started-with-windows-powershell?view=powershell-6)
are supported. Alternative shells include:
`resin sync` and `resin ssh` have not been thoroughly tested on the standard Windows cmd.exe shell. We recommend using bash (or a similar shell), like Bash for Windows 10 or [Git for Windows](https://git-for-windows.github.io/).
* [MSYS2](https://www.msys2.org/):
* Install additional packages with the command:
`pacman -S git gcc make openssh p7zip`
* [Set a Windows environment variable](https://www.onmsft.com/how-to/how-to-set-an-environment-variable-in-windows-10): `MSYS2_PATH_TYPE=inherit`
* Note that a bug in the MSYS2 launch script (`msys2_shell.cmd`) makes text-based interactive CLI
menus break. [Check this GitHub issue for a
workaround](https://github.com/msys2/MINGW-packages/issues/1633#issuecomment-240583890).
If you still want to use `cmd.exe` you will have to use a package manager like MinGW or chocolatey. For MinGW the steps are:
* [MSYS](http://www.mingw.org/wiki/MSYS)
* [Git for Windows](https://git-for-windows.github.io/)
* During the installation, you will be prompted to choose between _"Use MinTTY"_ and _"Use
Windows' default console window"._ Choose the latter, because of the same [MSYS2
bug](https://github.com/msys2/MINGW-packages/issues/1633) mentioned above (Git for Windows
actually uses MSYS2). For a screenshot, check this
[comment](https://github.com/balena-io/balena-cli/issues/598#issuecomment-556513098).
1. Install [MinGW](http://www.mingw.org).
2. Install the `msys-rsync` and `msys-openssh` packages.
3. Add MinGW to the `%PATH%` if this hasn't been done by the installer already. The location where the binaries are placed is usually `C:\MinGW\msys\1.0\bin`, but it can vary if you selected a different location in the installer.
4. Copy your SSH keys to `%homedrive%%homepath%\.ssh`.
5. If you need `ssh` to work behind the proxy you also need to install [proxytunnel](http://proxytunnel.sourceforge.net/)
* Microsoft's [Windows Subsystem for Linux](https://docs.microsoft.com/en-us/windows/wsl/about)
(WSL). In this case, a Linux distribution like Ubuntu is installed via the Microsoft Store, and a
balena CLI release **for Linux** should be selected. See
[FAQ](https://github.com/balena-io/balena-cli/blob/master/TROUBLESHOOTING.md) for using the
balena CLI with WSL and Docker Desktop for Windows.
Getting Started
---------------
On **macOS** and **Linux,** the standard terminal window is supported. Optionally, `bash` command
auto completion may be enabled by copying the
[balena_comp](https://github.com/balena-io/balena-cli/blob/master/completion/balena-completion.bash)
file to your system's `bash_completion` directory: check [Docker's command completion
guide](https://docs.docker.com/compose/completion/) for system setup instructions.
### NPM install
## Logging in
If you've got all the requirements above, you should be able to install the CLI directly from npm. If not,
or if you have any trouble with this, please try the new standalone install steps just below.
Several CLI commands require access to your balenaCloud account, for example in order to push a
new release to your fleet. Those commands require creating a CLI login session by running:
This might require elevated privileges in some environments.
```sh
$ balena login
$ npm install resin-cli -g --production --unsafe-perm
```
## Proxy support
`--unsafe-perm` is only required on systems where the global install directory is not user-writable.
This allows npm install steps to download and save prebuilt native binaries. You may be able to omit it,
especially if you're using a user-managed node install such as [nvm](https://github.com/creationix/nvm).
HTTP(S) proxies can be configured through any of the following methods, in precedence order
(from higher to lower):
In some environments, this process will need to build native modules. This may require a more complex build
environment, and notably requires Python 2.7. If you hit any problems with this, we recommend you try the
alternative standalone install below instead.
* The `BALENARC_PROXY` environment variable in URL format, with protocol (`http` or `https`),
host, port and optionally basic auth. Examples:
* `export BALENARC_PROXY='https://bob:secret@proxy.company.com:12345'`
* `export BALENARC_PROXY='http://localhost:8000'`
### Standalone install
* The `proxy` setting in the [CLI config
file](https://www.npmjs.com/package/balena-settings-client#documentation). It may be:
* A string in URL format, e.g. `proxy: 'http://localhost:8000'`
* An object in the format:
```yaml
proxy:
protocol: 'http'
host: 'proxy.company.com'
port: 12345
proxyAuth: 'bob:secret'
```
If you don't have node or a working pre-gyp environment, you can still install the CLI as a standalone
binary. **This is experimental and may not work perfectly yet in all environments**, but it seems to work
well in initial cross-platform testing, so it may be useful, and we'd love your feedback if you hit any issues.
* The `HTTPS_PROXY` and/or `HTTP_PROXY` environment variables, in the same URL format as
`BALENARC_PROXY`.
To install the CLI as a standalone binary (an illustrative shell sketch follows these steps):
### Proxy setup for balena ssh
* Download the latest zip for your OS from https://github.com/resin-io/resin-cli/releases.
* Extract the contents, putting the `resin-cli` folder somewhere appropriate for your system (e.g. `C:/resin-cli`, `/usr/local/lib/resin-cli`, etc).
* Add the `resin-cli` folder to your `PATH` ([Windows instructions](https://www.computerhope.com/issues/ch000549.htm), [Linux instructions](https://stackoverflow.com/questions/14637979/how-to-permanently-set-path-on-linux-unix), [OSX instructions](https://stackoverflow.com/questions/22465332/setting-path-environment-variable-in-osx-permanently))
* Running `resin` in a fresh command line should print the resin CLI help.
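As an illustration only, on Linux the steps might look like the following sketch (the zip file name and the install location are hypothetical and depend on your OS and the release you download):

```sh
# Hypothetical file name and paths, for illustration only
unzip resin-cli-linux-x64.zip
sudo mv resin-cli /usr/local/lib/resin-cli      # one of the suggested locations
export PATH="$PATH:/usr/local/lib/resin-cli"    # persist this in your shell profile
resin                                           # should print the resin CLI help
```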
In order to work behind a proxy server, the `balena ssh` command requires the
[`proxytunnel`](http://proxytunnel.sourceforge.net/) package (command-line tool) to be installed.
`proxytunnel` is available for Linux distributions like Ubuntu/Debian (`apt install proxytunnel`),
and for macOS through [Homebrew](https://brew.sh/). Windows support is limited to the [Windows
Subsystem for Linux](https://docs.microsoft.com/en-us/windows/wsl/about) (e.g., by installing
Ubuntu through the Microsoft App Store).
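For example, installing it typically looks like this (the Homebrew formula name is assumed to match the package name):

```sh
# Debian/Ubuntu
sudo apt install proxytunnel
# macOS, via Homebrew (formula name assumed)
brew install proxytunnel
```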
To update in future, simply download a new release and replace the extracted folder.
Ensure that the proxy server is configured to allow proxy requests to ssh port 22, using
SSL encryption. For example, in the case of the [Squid](http://www.squid-cache.org/) proxy
server, it should be configured with the following rules in the `squid.conf` file:
`acl SSL_ports port 22`
`acl Safe_ports port 22`
Have any problems, or see any unexpected behaviour? [Please file an issue!](https://github.com/resin-io/resin-cli/issues/new)
### Proxy exclusion
### Login
The `BALENARC_NO_PROXY` variable may be used to exclude specified destinations from proxying.
> * This feature requires CLI version 11.30.8 or later. In the case of the npm [installation
> option](https://github.com/balena-io/balena-cli/blob/master/INSTALL.md), it also requires
> Node.js version 10.16.0 or later.
> * To exclude a `balena ssh` target from proxying (IP address or `.local` hostname), the
> `--noproxy` option should be specified in addition to the `BALENARC_NO_PROXY` variable.
By default (if `BALENARC_NO_PROXY` is not defined), all [private IPv4
addresses](https://en.wikipedia.org/wiki/Private_network) and `'*.local'` hostnames are excluded
from proxying. Other hostnames that resolve to private IPv4 addresses are **not** excluded by
default, because matching takes place before name resolution.
`localhost` and `127.0.0.1` are always excluded from proxying, regardless of the value of
`BALENARC_NO_PROXY`.
The format of the `BALENARC_NO_PROXY` environment variable is a comma-separated list of patterns
that are matched against hostnames or IP addresses. For example:
```
export BALENARC_NO_PROXY='*.local,dev*.mycompany.com,192.168.*'
```

```sh
$ resin login
```
Matched patterns are excluded from proxying. Wildcard expressions are documented at
[matcher](https://www.npmjs.com/package/matcher#usage). Matching takes place _before_ name
resolution, so a pattern like `'192.168.*'` will **not** match a hostname that resolves to an IP
address like `192.168.1.2`.
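Putting these notes together, a minimal sketch for excluding a local device from proxying might look like this (the device address is illustrative):

```sh
# Exclude .local hostnames and the 192.168.* range from proxying
export BALENARC_NO_PROXY='*.local,192.168.*'
# For `balena ssh`, the --noproxy option is additionally required (see the note above)
balena ssh 192.168.1.42 --noproxy
```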
_(Typically useful, but not strictly required for all commands)_
## Command reference documentation
### Run commands
The full CLI command reference is available [on the web](https://www.balena.io/docs/reference/cli/) or by running `balena help --verbose`.
Take a look at the full command documentation at [https://docs.resin.io/tools/cli/](https://docs.resin.io/tools/cli/#table-of-contents), or by running `resin help`.
## Support, FAQ and troubleshooting
### Bash completions
To learn more, troubleshoot issues, or to contact us for support:
Optionally you can enable tab completions for the bash shell, enabling the shell to provide additional context and automatically complete arguments to `resin`. To enable bash completions, copy the `resin-completion.bash` file to the default bash completions directory (usually `/etc/bash_completion.d/`) or append it to the end of `~/.bash_completion`.
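For example (the repository-relative path to the completion file is assumed):

```sh
# System-wide, using the default completions directory:
sudo cp completion/resin-completion.bash /etc/bash_completion.d/
# ...or for the current user only:
cat completion/resin-completion.bash >> ~/.bash_completion
```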
* Check the [masterclass tutorials](https://www.balena.io/docs/learn/more/masterclasses/overview/)
* Check our [FAQ / troubleshooting document](https://github.com/balena-io/balena-cli/blob/master/TROUBLESHOOTING.md)
* Ask us a question in the [balena forums](https://forums.balena.io/c/product-support)
FAQ
---
For CLI bug reports or feature requests, check the
[CLI GitHub issues](https://github.com/balena-io/balena-cli/issues/).
### Where is my configuration file?
## Deprecation policy
The per-user configuration file lives in `$HOME/.resinrc.yml` or `%UserProfile%\_resinrc.yml`, in Unix based operating systems and Windows respectively.
The balena CLI uses [semver versioning](https://semver.org/), with the concepts
of major, minor and patch version releases.
The Resin CLI also attempts to read a `resinrc.yml` file in the current directory, which takes precedence over the per-user configuration file.
The latest release of a major version of the balena CLI will remain compatible with
the balenaCloud backend services for at least one year from the date when the
following major version is released. For example, balena CLI v11.36.0, as the
latest v11 release, would remain compatible with the balenaCloud backend for one
year from the date when v12.0.0 was released.
### How do I point the Resin CLI to staging?
Halfway through that period (6 months after the release of the next major
version), older major versions of the balena CLI will start printing a deprecation
warning message when used interactively (when `stderr` is attached to a TTY
device file). At the end of that period, older major versions will exit with an
error message unless the `--unsupported` flag is used. This behavior was
introduced in CLI version 12.47.0 and is also documented by `balena help`.
To take advantage of the latest backend features and ensure compatibility, users
are encouraged to regularly update the balena CLI to the latest version.
The easiest way is to set the `RESINRC_RESIN_URL=resinstaging.io` environment variable.
## Contributing (including editing documentation files)
Alternatively, you can edit your configuration file and set `resinUrl: resinstaging.io` to persist this setting.
Please have a look at the [CONTRIBUTING.md](./CONTRIBUTING.md) file for some guidance before
submitting a pull request or updating documentation (because some files are automatically
generated). Thank you for your help and interest!
### How do I make the Resin CLI persist data in another directory?
## License
The Resin CLI persists your session token, as well as cached images in `$HOME/.resin` or `%UserProfile%\_resin`.
The project is licensed under the [Apache 2.0 License](https://www.apache.org/licenses/LICENSE-2.0).
A copy is also available in the LICENSE file in this repository.
Pointing the Resin CLI to persist data in another location is necessary in certain environments, like a server, where there is no home directory, or a device running resinOS, which erases all data after a restart.
You can accomplish this by setting `RESINRC_DATA_DIRECTORY=/opt/resin` or adding `dataDirectory: /opt/resin` to your configuration file, replacing `/opt/resin` with your desired directory.
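For example, either of the following achieves this (using the `/opt/resin` directory mentioned above):

```sh
# Environment variable approach:
export RESINRC_DATA_DIRECTORY=/opt/resin
# ...or persist the equivalent setting in the per-user configuration file:
echo 'dataDirectory: /opt/resin' >> ~/.resinrc.yml
```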
Support
-------
If you're having any problems, check our [troubleshooting guide](https://github.com/resin-io/resin-cli/blob/master/TROUBLESHOOTING.md) and if your problem is not addressed there, please [raise an issue](https://github.com/resin-io/resin-cli/issues/new) on GitHub and the resin.io team will be happy to help.
You can also get in touch with us in the resin.io [forums](https://forums.resin.io/).
License
-------
The project is licensed under the Apache 2.0 license.

@ -1,131 +1,59 @@
# balena CLI FAQ & Troubleshooting
Troubleshooting
===============
## Where is the balena CLI's configuration file located?
This document contains common issues related to the Resin CLI, and how to fix them.
The per-user configuration file lives in `$HOME/.balenarc.yml` or `%UserProfile%\_balenarc.yml`, in
Unix based operating systems and Windows respectively.
### After burning to an sdcard, my device doesn't boot
The balena CLI also attempts to read a `balenarc.yml` file in the current directory, which takes
precedence over the per-user configuration file.
- The downloaded image is not complete (download was interrupted).
## How do I point the balena CLI to the staging environment?
Please clean the cache (`$HOME/.resin/cache` or `C:\Users\<user>\_resin\cache`) and run the command again. In the future, the CLI will detect that the image is incomplete and clean the cache for you.
Set the `BALENARC_BALENA_URL=balena-staging.com` environment variable, or add
`balenaUrl: balena-staging.com` to the balena CLI's configuration file.
### I get a permission error when burning to an sdcard
## How do I make the balena CLI persist data in another directory?
- The SDCard is locked.
The balena CLI persists the session token, as well as cached assets, to `$HOME/.balena` or
`%UserProfile%\_balena`. This directory can be changed by setting an environment variable,
`BALENARC_DATA_DIRECTORY=/opt/balena`, or by adding `dataDirectory: /opt/balena` to the CLI's
configuration file, replacing `/opt/balena` with the desired directory.
### I get EINVAL errors on Cygwin
## After burning to an SD card, my device doesn't boot
Check whether the downloaded image is incomplete (download was interrupted) or corrupted.
Try clearing the cache (`$HOME/.balena/cache` or `C:\Users\<user>\_balena\cache`) and running the
command again.
## I get a permission error when burning to an SD card
Check whether the SD card is locked (a physical switch on the side of the card).
## I get `connect ETIMEDOUT` with `balena tunnel`
Please update the CLI to the latest version. This issue was fixed in v12.38.5.
For more details, see: https://github.com/balena-io/balena-cli/issues/2172
## I get EINVAL errors on Cygwin
The errors may look something like this:
The errors look something like this:
```
net.js:156
this._handle.open(options.fd);
^
Error: EINVAL, invalid argument
at new Socket (net.js:156:18)
at process.stdin (node.js:664:19)
at Object.Interface.createInterface (C:\cygwin\home\Juan Cruz Viotti\Projects\resin-cli\node_modules\inquirer\node_modules\readline2\index.js:31:43)
at PromptUI.UI (C:\cygwin\home\Juan Cruz Viotti\Projects\resin-cli\node_modules\inquirer\lib\ui\baseUI.js:23:40)
at new PromptUI (C:\cygwin\home\Juan Cruz Viotti\Projects\resin-cli\node_modules\inquirer\lib\ui\prompt.js:26:8)
at Object.promptModule [as prompt] (C:\cygwin\home\Juan Cruz Viotti\Projects\resin-cli\node_modules\inquirer\lib\inquirer.js:27:14)
```
Some interactive widgets don't work on `Cygwin`. On Windows, PowerShell or `cmd.exe` are better
supported. Alternative shells are [listed in the README
file](./README.md#choosing-a-shell-command-promptterminal).
- Some interactive widgets don't work on `Cygwin`. If you're running Windows, it's preferable that you use `cmd.exe`, as `Cygwin` is [not officially supported by Node.js](https://github.com/chjj/blessed/issues/56#issuecomment-42671945).
## I get `Invalid MBR boot signature` when configuring a device
### I get `Invalid MBR boot signature` when configuring a device
This error, accompanied by something like `Expected 0xAA55, but saw 0x29FE`, usually indicates a corrupted device operating system image in the cache, due to a bad internet connection during the download process.
Try clearing the cache with the following command and trying again:
```sh
$ rm -rf $HOME/.balena/cache
$ rm -rf $HOME/.resin/cache
```
Or in Windows:
```sh
> del /s /q %UserProfile%\_balena\cache
> del /s /q %UserProfile%\_resin\cache
```
## I get `EACCES: permission denied` when logging in
### I get `EACCES: permission denied` when logging in
The balena CLI stores the session token in `$HOME/.balena` or `C:\Users\<user>\_balena` in UNIX based
operating systems and Windows respectively. This error usually indicates that the user doesn't have
permissions over that directory, which can happen if the CLI was executed as the `root` user.
The Resin CLI stores the session token in `$HOME/.resin` or `C:\Users\<user>\_resin` in UNIX based operating systems and Windows respectively. This error usually indicates that the user doesn't have permissions over that directory, which can happen if you ran the Resin CLI as `root`, leaving the directory owned by the root user.
Try resetting the ownership by running:
```sh
$ sudo chown -R <user> $HOME/.balena
$ sudo chown -R <user> $HOME/.resin
```
## Broken line wrapping / cursor behavior with `balena ssh`
Users sometimes come across broken line wrapping or cursor behavior in text terminals, for example
when long command lines are typed in a `balena ssh` session, or when using text editors like `vim`
or `nano`. This is not something specific to the balena CLI, being also a commonly reported issue
with standard remote terminal tools like `ssh` or `telnet`. It is often a remote shell
configuration issue (files like `/etc/profile`, `~/.bash_profile`, `~/.bash_login`, `~/.profile`
and the like on the remote machine), including UTF-8 misconfiguration, the use of unsupported ASCII
control characters in shell prompt formatting (e.g. the `$PS1` env var) or the output of tools or
log files that use colored text. The issue can sometimes be fixed by simply resizing the client
terminal window, or by running one or more of the following commands on the shell:
```sh
export TERMINAL=linux
stty sane
shopt -s checkwinsize
bind 'set horizontal-scroll-mode off'
```
Terminal multiplexer tools like GNU `screen` or `tmux` are sometimes reported to fix the issues, though at other times they are reported as the _cause_ of the problem. They have their own configuration files to take into account.
Further reference:
* https://stackoverflow.com/questions/1133031/shell-prompt-line-wrapping-issue
* https://superuser.com/questions/46948/any-way-to-fix-screens-mishandling-of-line-wrap-maybe-only-terminal-app
* https://unix.stackexchange.com/questions/105958/terminal-prompt-not-wrapping-correctly
* https://unix.stackexchange.com/questions/529377/terminal-long-line-wrapping
* https://github.com/microsoft/WSL/issues/1436
If nothing seems to help, consider also using a different client-side terminal application:
* Linux: xterm, KDE Konsole, GNOME Terminal
* Mac: Terminal, iTerm2
* Windows: PowerShell, PuTTY, WSL (Windows Subsystem for Linux)
## "Docker seems to be unavailable" error when using Windows Subsystem for Linux (WSL)
When running on WSL, the recommendation is to install a CLI release for Linux, like the standalone
zip package for Linux. However, commands like "balena build" will, by default, attempt to reach the
Docker daemon at the Unix socket path `/var/run/docker.sock`, while Docker Desktop for Windows uses
a Windows named pipe at `//./pipe/docker_engine` (which the Linux CLI on WSL cannot use). A
solution is:
- Open the Docker Desktop for Windows settings panel and tick the checkbox _"Expose daemon on tcp://localhost:2375 without TLS"._
- On the WSL command line, set an env var:
`export DOCKER_HOST=tcp://localhost:2375`
Alternatively, use the command-line options `-h 127.0.0.1 -p 2375` for commands like `balena build` and `balena deploy`.
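A minimal sketch of the environment variable approach, run from the WSL shell (assuming the Docker Desktop setting above has been enabled):

```sh
# Point the Linux CLI running under WSL at Docker Desktop's TCP endpoint
export DOCKER_HOST=tcp://localhost:2375
# Alternatively, pass the host and port per invocation, e.g.:
#   balena build -h 127.0.0.1 -p 2375 ...   # plus your usual build arguments
```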
Further reference:
- https://techcommunity.microsoft.com/t5/Containers/WSL-Interoperability-with-Docker/ba-p/382405
- https://forums.docker.com/t/wsl-and-docker-for-windows-cannot-connect-to-the-docker-daemon-at-tcp-localhost-2375-is-the-docker-daemon-running/63571/12

34
appveyor.yml Normal file
@ -0,0 +1,34 @@
# appveyor file
# http://www.appveyor.com/docs/appveyor-yml
init:
- git config --global core.autocrlf input
cache:
- C:\Users\appveyor\.node-gyp
- '%AppData%\npm-cache'
matrix:
fast_finish: true
# what combinations to test
environment:
matrix:
- nodejs_version: 6
install:
- ps: Install-Product node $env:nodejs_version x64
- npm install -g npm@4
- set PATH=%APPDATA%\npm;%PATH%
- npm install
build: off
test_script:
- node --version
- npm --version
- cmd: npm test
deploy_script:
- IF "%APPVEYOR_REPO_TAG%" == "true" (npm run release)
- IF NOT "%APPVEYOR_REPO_TAG%" == "true" (echo 'Not tagged, skipping deploy')

570
automation/build-bin.ts Normal file → Executable file
@ -1,558 +1,36 @@
/**
* @license
* Copyright 2019 Balena Ltd.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
import type { JsonVersions } from '../lib/commands/version';
import { run as oclifRun } from 'oclif';
import * as archiver from 'archiver';
import * as Bluebird from 'bluebird';
import { execFile } from 'child_process';
import * as filehound from 'filehound';
import { Stats } from 'fs';
import * as fs from 'fs-extra';
import * as klaw from 'klaw';
import * as _ from 'lodash';
import * as path from 'path';
import * as rimraf from 'rimraf';
import * as semver from 'semver';
import { promisify } from 'util';
import * as fs from 'fs-extra';
import * as filehound from 'filehound';
import { exec as execPkg } from 'pkg';
import { stripIndent } from '../build/utils/lazy';
import {
diffLines,
loadPackageJson,
ROOT,
StdOutTap,
whichSpawn,
} from './utils';
const ROOT = path.join(__dirname, '..');
const execFileAsync = promisify(execFile);
console.log('Building package...\n');
export const packageJSON = loadPackageJson();
export const version = 'v' + packageJSON.version;
const arch = process.arch;
execPkg(['--target', 'host', '--output', 'build-bin/resin', 'package.json'])
.then(() =>
fs.copy(
path.join(ROOT, 'node_modules', 'opn', 'xdg-open'),
path.join(ROOT, 'build-bin', 'xdg-open'),
),
)
.then(() => {
return filehound
.create()
.paths(path.join(ROOT, 'node_modules'))
.ext(['node', 'dll'])
.find();
})
.then(nativeExtensions => {
console.log(`\nCopying to build-bin:\n${nativeExtensions.join('\n')}`);
function dPath(...paths: string[]) {
return path.join(ROOT, 'dist', ...paths);
}
interface PathByPlatform {
[platform: string]: string;
}
const standaloneZips: PathByPlatform = {
linux: dPath(`balena-cli-${version}-linux-${arch}-standalone.zip`),
darwin: dPath(`balena-cli-${version}-macOS-${arch}-standalone.zip`),
win32: dPath(`balena-cli-${version}-windows-${arch}-standalone.zip`),
};
const oclifInstallers: PathByPlatform = {
darwin: dPath('macos', `balena-${version}.pkg`),
win32: dPath('win32', `balena-${version}-${arch}.exe`),
};
const renamedOclifInstallers: PathByPlatform = {
darwin: dPath(`balena-cli-${version}-macOS-${arch}-installer.pkg`),
win32: dPath(`balena-cli-${version}-windows-${arch}-installer.exe`),
};
export const finalReleaseAssets: { [platform: string]: string[] } = {
win32: [standaloneZips['win32'], renamedOclifInstallers['win32']],
darwin: [standaloneZips['darwin'], renamedOclifInstallers['darwin']],
linux: [standaloneZips['linux']],
};
/**
* Given the output of `pkg` as a string (containing warning messages),
* diff it against previously saved output of known "safe" warnings.
* Throw an error if the diff is not empty.
*/
async function diffPkgOutput(pkgOut: string) {
const { monochrome } = await import('../tests/helpers');
const relSavedPath = path.join(
'tests',
'test-data',
'pkg',
`expected-warnings-${process.platform}.txt`,
);
const absSavedPath = path.join(ROOT, relSavedPath);
const ignoreStartsWith = [
'> pkg@',
'> Fetching base Node.js binaries',
' fetched-',
'prebuild-install WARN install No prebuilt binaries found',
];
const modulesRE =
process.platform === 'win32'
? /(?<=[ '])([A-Z]:)?\\.+?\\node_modules(?=\\)/
: /(?<=[ '])\/.+?\/node_modules(?=\/)/;
const buildRE =
process.platform === 'win32'
? /(?<=[ '])([A-Z]:)?\\.+\\build(?=\\)/
: /(?<=[ '])\/.+\/build(?=\/)/;
const cleanLines = (chunks: string | string[]) => {
const lines = typeof chunks === 'string' ? chunks.split('\n') : chunks;
return lines
.map((line: string) => monochrome(line)) // remove ASCII colors
.filter((line: string) => !/^\s*$/.test(line)) // blank lines
.filter((line: string) =>
ignoreStartsWith.every((i) => !line.startsWith(i)),
)
.map((line: string) => {
// replace absolute paths with relative paths
let replaced = line.replace(modulesRE, 'node_modules');
if (replaced === line) {
replaced = line.replace(buildRE, 'build');
}
return replaced;
});
};
pkgOut = cleanLines(pkgOut).join('\n');
const { readFile } = (await import('fs')).promises;
const expectedOut = cleanLines(await readFile(absSavedPath, 'utf8')).join(
'\n',
);
if (expectedOut !== pkgOut) {
const sep =
'================================================================================';
const diff = diffLines(expectedOut, pkgOut);
const msg = `pkg output does not match expected output from "${relSavedPath}"
Diff:
${sep}
${diff}
${sep}
Check whether the new or changed pkg warnings are safe to ignore, then update
"${relSavedPath}"
and share the result of your investigation as comments on the pull request.
Hint: the fix is often a matter of updating the 'pkg.scripts' or 'pkg.assets'
sections in the CLI's 'package.json' file, or a matter of updating the
'buildPkg' function in 'automation/build-bin.ts'. Sometimes it requires
patching dependencies: See for example 'patches/all/open+7.0.2.patch'.
${sep}
`;
throw new Error(msg);
}
}
/**
* Call `pkg.exec` to generate the standalone zip file, capturing its warning
* messages (stdout and stderr) in order to call diffPkgOutput().
*/
async function execPkg(...args: any[]) {
const { exec: pkgExec } = await import('pkg');
const outTap = new StdOutTap(true);
try {
outTap.tap();
await (pkgExec as any)(...args);
} catch (err) {
outTap.untap();
console.log(outTap.stdoutBuf.join(''));
console.error(outTap.stderrBuf.join(''));
throw err;
}
outTap.untap();
await diffPkgOutput(outTap.allBuf.join(''));
}
/**
* Use the 'pkg' module to create a single large executable file with
* the contents of 'node_modules' and the CLI's javascript code.
* Also copy a number of native modules (binary '.node' files) that are
* compiled during 'npm install' to the 'build-bin' folder, alongside
* the single large executable file created by pkg. (This is necessary
* because of a pkg limitation that does not allow binary executables
* to be directly executed from inside another binary executable.)
*/
async function buildPkg() {
const args = [
'--target',
'host',
'--output',
'build-bin/balena',
'package.json',
];
console.log('=======================================================');
console.log(`execPkg ${args.join(' ')}`);
console.log(`cwd="${process.cwd()}" ROOT="${ROOT}"`);
console.log('=======================================================');
await execPkg(args);
const paths: Array<[string, string[], string[]]> = [
// [platform, [source path], [destination path]]
['*', ['open', 'xdg-open'], ['xdg-open']],
['*', ['opn', 'xdg-open'], ['xdg-open-402']],
['darwin', ['denymount', 'bin', 'denymount'], ['denymount']],
];
await Promise.all(
paths.map(([platform, source, dest]) => {
if (platform === '*' || platform === process.platform) {
// eg copy from node_modules/open/xdg-open to build-bin/xdg-open
return fs.copy(
path.join(ROOT, 'node_modules', ...source),
path.join(ROOT, 'build-bin', ...dest),
);
}
}),
);
const nativeExtensionPaths: string[] = await filehound
.create()
.paths(path.join(ROOT, 'node_modules'))
.ext(['node', 'dll'])
.find();
console.log(`\nCopying to build-bin:\n${nativeExtensionPaths.join('\n')}`);
await Promise.all(
nativeExtensionPaths.map((extPath) =>
fs.copy(
return nativeExtensions.map((extPath: string) => {
return fs.copy(
extPath,
extPath.replace(
path.join(ROOT, 'node_modules'),
path.join(ROOT, 'build-bin'),
),
),
),
);
}
/**
* Run some basic tests on the built pkg executable.
* TODO: test more than just `balena version -j`; integrate with the
* existing mocha/chai CLI command testing.
*/
async function testPkg() {
const pkgBalenaPath = path.join(
ROOT,
'build-bin',
process.platform === 'win32' ? 'balena.exe' : 'balena',
);
console.log(`Testing standalone package "${pkgBalenaPath}"...`);
// Run `balena version -j`, parse its stdout as JSON, and check that the
// reported Node.js major version matches semver.major(process.version)
let { stdout, stderr } = await execFileAsync(pkgBalenaPath, [
'version',
'-j',
]);
const { filterCliOutputForTests } = await import('../tests/helpers');
const filtered = filterCliOutputForTests({
err: stderr.split(/\r?\n/),
out: stdout.split(/\r?\n/),
});
stdout = filtered.out.join('\n');
stderr = filtered.err.join('\n');
let pkgNodeVersion = '';
let pkgNodeMajorVersion = 0;
try {
const balenaVersions: JsonVersions = JSON.parse(stdout);
pkgNodeVersion = balenaVersions['Node.js'];
pkgNodeMajorVersion = semver.major(pkgNodeVersion);
} catch (err) {
throw new Error(stripIndent`
Error parsing JSON output of "balena version -j": ${err}
Original output: "${stdout}"`);
}
if (semver.major(process.version) !== pkgNodeMajorVersion) {
throw new Error(
`Mismatched major version: built-in pkg Node version="${pkgNodeVersion}" vs process.version="${process.version}"`,
);
}
if (filtered.err.length > 0) {
const err = filtered.err.join('\n');
throw new Error(`"${pkgBalenaPath}": non-empty stderr "${err}"`);
}
console.log('Success! (standalone package test successful)');
}
/**
* Create the zip file for the standalone 'pkg' bundle previously created
* by the buildPkg() function in 'build-bin.ts'.
*/
async function zipPkg() {
const outputFile = standaloneZips[process.platform];
if (!outputFile) {
throw new Error(
`Standalone installer unavailable for platform "${process.platform}"`,
);
}
await fs.mkdirp(path.dirname(outputFile));
await new Promise((resolve, reject) => {
console.log(`Zipping standalone package to "${outputFile}"...`);
const archive = archiver('zip', {
zlib: { level: 7 },
);
});
archive.directory(path.join(ROOT, 'build-bin'), 'balena-cli');
const outputStream = fs.createWriteStream(outputFile);
outputStream.on('close', resolve);
outputStream.on('error', reject);
archive.on('error', reject);
archive.on('warning', console.warn);
archive.pipe(outputStream);
archive.finalize().catch(reject);
});
}
async function signFilesForNotarization() {
console.log('Deleting unneeded zip files...');
await new Promise((resolve, reject) => {
klaw('node_modules/')
.on('data', (item: { path: string; stats: Stats }) => {
if (!item.stats.isFile()) {
return;
}
if (path.basename(item.path).endsWith('.node.bak')) {
console.log('Removing pkg .node.bak file', item.path);
fs.unlinkSync(item.path);
}
if (
path.basename(item.path).endsWith('.zip') &&
path.dirname(item.path).includes('test')
) {
console.log('Removing zip', item.path);
fs.unlinkSync(item.path);
}
})
.on('end', resolve)
.on('error', reject);
});
// Sign all .node files first
console.log('Signing .node files...');
await new Promise((resolve, reject) => {
klaw('node_modules/')
.on('data', async (item: { path: string; stats: Stats }) => {
if (!item.stats.isFile()) {
return;
}
if (path.basename(item.path).endsWith('.node')) {
console.log('running command:', 'codesign', [
'-d',
'-f',
'-s',
'Developer ID Application: Balena Ltd (66H43P8FRG)',
item.path,
]);
await whichSpawn('codesign', [
'-d',
'-f',
'-s',
'Developer ID Application: Balena Ltd (66H43P8FRG)',
item.path,
]);
}
})
.on('end', resolve)
.on('error', reject);
});
console.log('Signing other binaries...');
console.log('running command:', 'codesign', [
'-d',
'-f',
'--options=runtime',
'-s',
'Developer ID Application: Balena Ltd (66H43P8FRG)',
'node_modules/denymount/bin/denymount',
]);
await whichSpawn('codesign', [
'-d',
'-f',
'--options=runtime',
'-s',
'Developer ID Application: Balena Ltd (66H43P8FRG)',
'node_modules/denymount/bin/denymount',
]);
console.log('running command:', 'codesign', [
'-d',
'-f',
'--options=runtime',
'-s',
'Developer ID Application: Balena Ltd (66H43P8FRG)',
'node_modules/macmount/bin/macmount',
]);
await whichSpawn('codesign', [
'-d',
'-f',
'--options=runtime',
'-s',
'Developer ID Application: Balena Ltd (66H43P8FRG)',
'node_modules/macmount/bin/macmount',
]);
}
export async function buildStandaloneZip() {
console.log(`Building standalone zip package for CLI ${version}`);
try {
await buildPkg();
await testPkg();
await zipPkg();
console.log(`Standalone zip package build completed`);
} catch (error) {
console.error(`Error creating or testing standalone zip package`);
throw error;
}
}
async function renameInstallerFiles() {
if (await fs.pathExists(oclifInstallers[process.platform])) {
await fs.rename(
oclifInstallers[process.platform],
renamedOclifInstallers[process.platform],
);
}
}
/**
* If the CSC_LINK and CSC_KEY_PASSWORD env vars are set, digitally sign the
* executable installer using Microsoft SignTool.exe (Sign Tool)
* https://learn.microsoft.com/en-us/dotnet/framework/tools/signtool-exe
*/
async function signWindowsInstaller() {
if (process.env.CSC_LINK && process.env.CSC_KEY_PASSWORD) {
const exeName = renamedOclifInstallers[process.platform];
console.log(`Signing installer "${exeName}"`);
// trust ...
await execFileAsync('signtool.exe', [
'sign',
'-t',
process.env.TIMESTAMP_SERVER || 'http://timestamp.comodoca.com',
'-f',
process.env.CSC_LINK,
'-p',
process.env.CSC_KEY_PASSWORD,
'-d',
`balena-cli ${version}`,
exeName,
]);
// ... but verify
await execFileAsync('signtool.exe', ['verify', '-pa', '-v', exeName]);
} else {
console.log(
'Skipping installer signing step because CSC_* env vars are not set',
);
}
}
/**
* Wait for Apple Installer Notarization to continue
*/
async function notarizeMacInstaller(): Promise<void> {
const appleId =
process.env.XCODE_APP_LOADER_EMAIL || 'accounts+apple@balena.io';
const appBundleId = packageJSON.oclif.macos.identifier || 'io.balena.cli';
const appleIdPassword = process.env.XCODE_APP_LOADER_PASSWORD;
if (appleIdPassword) {
const { notarize } = await import('electron-notarize');
// https://github.com/electron/notarize/blob/main/README.md
await notarize({
appBundleId,
appPath: renamedOclifInstallers.darwin,
appleId,
appleIdPassword,
});
}
}
/**
* Run the `oclif pack:win` or `pack:macos` command (depending on the value
* of process.platform) to generate the native installers (which end up under
* the 'dist' folder). There are some hardcoded options such as selecting only
* 64-bit binaries under Windows.
*/
export async function buildOclifInstaller() {
let packOS = '';
let packOpts = ['-r', ROOT];
if (process.platform === 'darwin') {
packOS = 'macos';
} else if (process.platform === 'win32') {
packOS = 'win';
packOpts = packOpts.concat('-t', 'win32-x64');
}
if (packOS) {
console.log(`Building oclif installer for CLI ${version}`);
const packCmd = `pack:${packOS}`;
const dirs = [path.join(ROOT, 'dist', packOS)];
if (packOS === 'win') {
dirs.push(path.join(ROOT, 'tmp', 'win*'));
}
for (const dir of dirs) {
console.log(`rimraf(${dir})`);
await Bluebird.fromCallback((cb) => rimraf(dir, cb));
}
if (process.platform === 'darwin') {
console.log('Signing files for notarization...');
await signFilesForNotarization();
}
console.log('=======================================================');
console.log(`oclif "${packCmd}" "${packOpts.join('" "')}"`);
console.log(`cwd="${process.cwd()}" ROOT="${ROOT}"`);
console.log('=======================================================');
await oclifRun([packCmd].concat(...packOpts));
await renameInstallerFiles();
// The Windows installer is explicitly signed here (oclif doesn't do it).
// The macOS installer is automatically signed by oclif (which runs the
// `pkgbuild` tool), using the certificate name given in package.json
// (`oclif.macos.sign` section).
if (process.platform === 'win32') {
await signWindowsInstaller();
} else if (process.platform === 'darwin') {
console.log('Notarizing package...');
await notarizeMacInstaller(); // Notarize
console.log('Package notarized.');
}
console.log(`oclif installer build completed`);
}
}
/**
* Wrapper around the npm `catch-uncommitted` package in order to run it
* conditionally, only when:
* - A CI env var is set (CI=true), and
* - The OS is not Windows. (`catch-uncommitted` fails on Windows)
*/
export async function catchUncommitted(): Promise<void> {
if (process.env.DEBUG) {
console.error(`[debug] CI=${process.env.CI} platform=${process.platform}`);
}
if (
process.env.CI &&
['true', 'yes', '1'].includes(process.env.CI.toLowerCase()) &&
process.platform !== 'win32'
) {
await whichSpawn('npx', [
'catch-uncommitted',
'--catch-no-git',
'--skip-node-versionbot-changes',
'--ignore-space-at-eol',
]);
}
}
export async function testShrinkwrap(): Promise<void> {
if (process.env.DEBUG) {
console.error(`[debug] platform=${process.platform}`);
}
if (process.platform !== 'win32') {
await whichSpawn(path.resolve(__dirname, 'test-lock-deduplicated.sh'));
}
}

@ -1,220 +0,0 @@
/**
* @license
* Copyright 2019 Balena Ltd.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
import * as path from 'path';
import { MarkdownFileParser } from './utils';
/**
* This is the skeleton of CLI documentation/reference web page at:
* https://www.balena.io/docs/reference/cli/
*
* The `getCapitanoDoc` function in this module parses README.md and adds
* some content to this object.
*/
const capitanoDoc = {
title: 'balena CLI Documentation',
introduction: '',
categories: [
{
title: 'API keys',
files: ['build/commands/api-key/generate.js'],
},
{
title: 'Fleet',
files: [
'build/commands/fleets.js',
'build/commands/fleet/index.js',
'build/commands/fleet/create.js',
'build/commands/fleet/purge.js',
'build/commands/fleet/rename.js',
'build/commands/fleet/restart.js',
'build/commands/fleet/rm.js',
],
},
{
title: 'Authentication',
files: [
'build/commands/login.js',
'build/commands/logout.js',
'build/commands/whoami.js',
],
},
{
title: 'Device',
files: [
'build/commands/devices/index.js',
'build/commands/devices/supported.js',
'build/commands/device/index.js',
'build/commands/device/deactivate.js',
'build/commands/device/identify.js',
'build/commands/device/init.js',
'build/commands/device/local-mode.js',
'build/commands/device/move.js',
'build/commands/device/os-update.js',
'build/commands/device/public-url.js',
'build/commands/device/purge.js',
'build/commands/device/reboot.js',
'build/commands/device/register.js',
'build/commands/device/rename.js',
'build/commands/device/restart.js',
'build/commands/device/rm.js',
'build/commands/device/shutdown.js',
],
},
{
title: 'Releases',
files: [
'build/commands/releases.js',
'build/commands/release/index.js',
'build/commands/release/finalize.js',
],
},
{
title: 'Environment Variables',
files: [
'build/commands/envs.js',
'build/commands/env/add.js',
'build/commands/env/rename.js',
'build/commands/env/rm.js',
],
},
{
title: 'Tags',
files: [
'build/commands/tags.js',
'build/commands/tag/rm.js',
'build/commands/tag/set.js',
],
},
{
title: 'Help and Version',
files: ['help', 'build/commands/version.js'],
},
{
title: 'Keys',
files: [
'build/commands/keys.js',
'build/commands/key/index.js',
'build/commands/key/add.js',
'build/commands/key/rm.js',
],
},
{
title: 'Logs',
files: ['build/commands/logs.js'],
},
{
title: 'Network',
files: [
'build/commands/scan.js',
'build/commands/ssh.js',
'build/commands/tunnel.js',
],
},
{
title: 'Notes',
files: ['build/commands/note.js'],
},
{
title: 'OS',
files: [
'build/commands/os/build-config.js',
'build/commands/os/configure.js',
'build/commands/os/versions.js',
'build/commands/os/download.js',
'build/commands/os/initialize.js',
],
},
{
title: 'Config',
files: [
'build/commands/config/generate.js',
'build/commands/config/inject.js',
'build/commands/config/read.js',
'build/commands/config/reconfigure.js',
'build/commands/config/write.js',
],
},
{
title: 'Preload',
files: ['build/commands/preload.js'],
},
{
title: 'Push',
files: ['build/commands/push.js'],
},
{
title: 'Settings',
files: ['build/commands/settings.js'],
},
{
title: 'Local',
files: [
'build/commands/local/configure.js',
'build/commands/local/flash.js',
],
},
{
title: 'Deploy',
files: ['build/commands/build.js', 'build/commands/deploy.js'],
},
{
title: 'Platform',
files: ['build/commands/join.js', 'build/commands/leave.js'],
},
{
title: 'Utilities',
files: ['build/commands/util/available-drives.js'],
},
{
title: 'Support',
files: ['build/commands/support.js'],
},
],
};
/**
* Modify and return the `capitanoDoc` object above in order to render the
* CLI documentation/reference web page at:
* https://www.balena.io/docs/reference/cli/
*
* This function parses the README.md file to extract relevant sections
* for the documentation web page.
*/
export async function getCapitanoDoc(): Promise<typeof capitanoDoc> {
const readmePath = path.join(__dirname, '..', '..', 'README.md');
const mdParser = new MarkdownFileParser(readmePath);
const sections: string[] = await Promise.all([
mdParser.getSectionOfTitle('About').then((sectionLines: string) => {
// delete the title of the 'About' section for the web page
const match = /^(#+)\s+.+?\n\s*([^]*)/.exec(sectionLines);
if (!match || match.length < 3) {
throw new Error(`Error parsing section title`);
}
// match[1] has the title, match[2] has the rest
return match && match[2];
}),
mdParser.getSectionOfTitle('Installation'),
mdParser.getSectionOfTitle('Choosing a shell (command prompt/terminal)'),
mdParser.getSectionOfTitle('Logging in'),
mdParser.getSectionOfTitle('Proxy support'),
mdParser.getSectionOfTitle('Support, FAQ and troubleshooting'),
mdParser.getSectionOfTitle('Deprecation policy'),
]);
capitanoDoc.introduction = sections.join('\n');
return capitanoDoc;
}

@ -1,22 +1,4 @@
/**
* @license
* Copyright 2019 Balena Ltd.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
import { Command as OclifCommandClass } from '@oclif/command';
type OclifCommand = typeof OclifCommandClass;
import { CommandDefinition } from 'capitano';
export interface Document {
title: string;
@ -26,7 +8,7 @@ export interface Document {
export interface Category {
title: string;
commands: OclifCommand[];
commands: CommandDefinition[];
}
export { OclifCommand };
export { CommandDefinition as Command };

@ -1,109 +1,34 @@
/**
* @license
* Copyright 2019 Balena Ltd.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
import capitanodoc = require('../../capitanodoc');
import * as _ from 'lodash';
import * as path from 'path';
import { getCapitanoDoc } from './capitanodoc';
import { Category, Document, OclifCommand } from './doc-types';
import * as markdown from './markdown';
import { stripIndent } from '../../lib/utils/lazy';
import { Document, Category } from './doc-types';
/**
* Generates the markdown document (as a string) for the CLI documentation
* page on the web: https://www.balena.io/docs/reference/cli/
*/
export async function renderMarkdown(): Promise<string> {
const capitanodoc = await getCapitanoDoc();
const result: Document = {
title: capitanodoc.title,
introduction: capitanodoc.introduction,
categories: [],
};
const result = <Document>{};
result.title = capitanodoc.title;
result.introduction = capitanodoc.introduction;
result.categories = [];
for (const commandCategory of capitanodoc.categories) {
const category: Category = {
title: commandCategory.title,
commands: [],
};
for (let commandCategory of capitanodoc.categories) {
const category = <Category>{};
category.title = commandCategory.title;
category.commands = [];
for (const jsFilename of commandCategory.files) {
category.commands.push(...importOclifCommands(jsFilename));
for (let file of commandCategory.files) {
// tslint:disable-next-line:no-var-requires
const actions: any = require(path.join(process.cwd(), file));
if (actions.signature) {
category.commands.push(_.omit(actions, 'action'));
} else {
for (let actionName of Object.keys(actions)) {
const actionCommand = actions[actionName];
category.commands.push(_.omit(actionCommand, 'action'));
}
}
result.categories.push(category);
}
return markdown.render(result);
result.categories.push(category);
}
// Help is now managed via a plugin
// This fake command allows capitanodoc to include help in docs
class FakeHelpCommand {
description = stripIndent`
List balena commands, or get detailed help for a specific command.
List balena commands, or get detailed help for a specific command.
`;
examples = [
'$ balena help',
'$ balena help login',
'$ balena help os download',
];
args = [
{
name: 'command',
description: 'command to show help for',
},
];
usage = 'help [command]';
flags = {
verbose: {
description: 'show additional commands',
char: '-v',
},
};
}
function importOclifCommands(jsFilename: string): OclifCommand[] {
// TODO: Currently oclif commands with no `usage` overridden will cause
// an error when parsed. This should be improved so that `usage` does not have
// to be overridden if not necessary.
const command: OclifCommand =
jsFilename === 'help'
? (new FakeHelpCommand() as unknown as OclifCommand)
: (require(path.join(process.cwd(), jsFilename)).default as OclifCommand);
return [command];
}
/**
* Print the CLI docs markdown to stdout.
* See package.json for how the output is redirected to a file.
*/
async function printMarkdown() {
try {
console.log(await renderMarkdown());
} catch (error) {
console.error(error);
process.exitCode = 1;
}
}
// tslint:disable-next-line:no-floating-promises
printMarkdown();
console.log(markdown.render(result));

@ -1,128 +1,76 @@
/**
* @license
* Copyright 2019 Balena Ltd.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
import { flagUsages } from '@oclif/parser';
import * as ent from 'ent';
import * as _ from 'lodash';
import * as ent from 'ent';
import * as utils from './utils';
import { Document, Category, Command } from './doc-types';
import { getManualSortCompareFunction } from '../../lib/utils/helpers';
import { capitanoizeOclifUsage } from '../../lib/utils/oclif-utils';
import { Category, Document, OclifCommand } from './doc-types';
export function renderCommand(command: Command) {
let result = `## ${ent.encode(command.signature)}\n\n${command.help}\n`;
function renderOclifCommand(command: OclifCommand): string[] {
const result = [`## ${ent.encode(command.usage || '')}`];
const description = (command.description || '')
.split('\n')
.slice(1) // remove the first line, which oclif uses as help header
.join('\n')
.trim();
result.push(description);
if (!_.isEmpty(command.options)) {
result += '\n### Options';
if (!_.isEmpty(command.examples)) {
result.push('Examples:', command.examples!.map((v) => `\t${v}`).join('\n'));
}
if (!_.isEmpty(command.args)) {
result.push('### Arguments');
for (const arg of command.args!) {
result.push(`#### ${arg.name.toUpperCase()}`, arg.description || '');
for (let option of command.options!) {
result += `\n\n#### ${utils.parseSignature(option)}\n\n${
option.description
}`;
}
result += '\n';
}
if (!_.isEmpty(command.flags)) {
result.push('### Options');
for (const [name, flag] of Object.entries(command.flags!)) {
if (name === 'help') {
continue;
}
flag.name = name;
const flagUsage = flagUsages([flag])
.map(([usage, _description]) => usage)
.join()
.trim();
result.push(`#### ${flagUsage}`);
result.push(flag.description || '');
}
}
return result;
}
function renderCategory(category: Category): string[] {
const result = [`# ${category.title}`];
for (const command of category.commands) {
result.push(...renderOclifCommand(command));
export function renderCategory(category: Category) {
let result = `# ${category.title}\n`;
for (let command of category.commands) {
result += `\n${renderCommand(command)}`;
}
return result;
}
function getAnchor(cmdSignature: string): string {
return `#${_.trim(cmdSignature.replace(/\W+/g, '-'), '-').toLowerCase()}`;
function getAnchor(command: Command) {
return (
'#' +
command.signature
.replace(/\s/g, '-')
.replace(/</g, '-')
.replace(/>/g, '-')
.replace(/\[/g, '-')
.replace(/\]/g, '-')
.replace(/-+/g, '-')
.replace(/\.\.\./g, '')
.replace(/\|/g, '')
.toLowerCase()
);
}
function renderToc(categories: Category[]): string[] {
const result = [`# CLI Command Reference`];
export function renderToc(categories: Category[]) {
let result = `# Table of contents\n`;
for (const category of categories) {
result.push(`- ${category.title}`);
result.push(
category.commands
.map((command) => {
const signature = capitanoizeOclifUsage(command.usage);
return `\t- [${ent.encode(signature)}](${getAnchor(signature)})`;
})
.join('\n'),
);
}
return result;
}
for (let category of categories) {
result += `\n- ${category.title}\n\n`;
const manualCategorySorting: { [category: string]: string[] } = {
'Environment Variables': ['envs', 'env rm', 'env add', 'env rename'],
OS: [
'os versions',
'os download',
'os build config',
'os configure',
'os initialize',
],
};
function sortCommands(doc: Document): void {
for (const category of doc.categories) {
if (category.title in manualCategorySorting) {
category.commands = category.commands.sort(
getManualSortCompareFunction<OclifCommand, string>(
manualCategorySorting[category.title],
(cmd: OclifCommand, x: string) =>
(cmd.usage || '').toString().replace(/\W+/g, ' ').includes(x),
),
);
for (let command of category.commands) {
result += `\t- [${ent.encode(command.signature)}](${getAnchor(
command,
)})\n`;
}
}
return result;
}
export function render(doc: Document) {
sortCommands(doc);
const result = [
`# ${doc.title}`,
doc.introduction,
...renderToc(doc.categories),
];
for (const category of doc.categories) {
result.push(...renderCategory(category));
let result = `# ${doc.title}\n\n${doc.introduction}\n\n${renderToc(
doc.categories,
)}`;
for (let category of doc.categories) {
result += `\n${renderCategory(category)}`;
}
return result.join('\n\n');
return result;
}

@ -1,24 +1,6 @@
/**
* @license
* Copyright 2019 Balena Ltd.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
import type { OptionDefinition } from 'capitano';
import { OptionDefinition } from 'capitano';
import * as _ from 'lodash';
import * as ent from 'ent';
import * as fs from 'fs';
import * as readline from 'readline';
export function getOptionPrefix(signature: string) {
if (signature.length > 1) {
@ -32,14 +14,14 @@ export function getOptionSignature(signature: string) {
return `${getOptionPrefix(signature)}${signature}`;
}
export function parseCapitanoOption(option: OptionDefinition): string {
export function parseSignature(option: OptionDefinition) {
let result = getOptionSignature(option.signature);
if (Array.isArray(option.alias)) {
for (const alias of option.alias) {
if (_.isArray(option.alias)) {
for (let alias of option.alias) {
result += `, ${getOptionSignature(alias)}`;
}
} else if (typeof option.alias === 'string') {
} else if (_.isString(option.alias)) {
result += `, ${getOptionSignature(option.alias)}`;
}
@ -49,88 +31,3 @@ export function parseCapitanoOption(option: OptionDefinition): string {
return ent.encode(result);
}
export class MarkdownFileParser {
constructor(public mdFilePath: string) {}
/**
* Extract the lines of a markdown document section with the given title.
* For example, consider this sample markdown document:
* ```
* # balena CLI
*
* ## Introduction
* Lorem ipsum dolor sit amet, consectetur adipiscing elit,
*
* ## Getting Started
* sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
*
* ### Prerequisites
* - Foo
* - Bar
*
* ## Support
* Ut enim ad minim veniam, quis nostrud exercitation ullamco laboris.
* ```
*
* Calling getSectionOfTitle('Getting Started') for the markdown doc above
* returns everything from line '## Getting Started' (included) to line
* '## Support' (excluded). This method counts the number of '#' characters
* to determine that subsections should be included as part of the parent
* section.
*
* @param title The section title without '#' chars, eg. 'Getting Started'
*/
public async getSectionOfTitle(
title: string,
includeSubsections = true,
): Promise<string> {
let foundSectionLines: string[];
let foundSectionLevel = 0;
const rl = readline.createInterface({
input: fs.createReadStream(this.mdFilePath),
crlfDelay: Infinity,
});
rl.on('line', (line) => {
// try to match a line like "## Getting Started", where the number
// of '#' characters is the sectionLevel ('##' -> 2), and the
// sectionTitle is "Getting Started"
const match = /^(#+)\s+(.+)/.exec(line);
if (match) {
const sectionLevel = match[1].length;
const sectionTitle = match[2];
// If the target section had already been found: append a line, or end it
if (foundSectionLines) {
if (!includeSubsections || sectionLevel <= foundSectionLevel) {
// end previously found section
rl.close();
}
} else if (sectionTitle === title) {
// found the target section
foundSectionLevel = sectionLevel;
foundSectionLines = [];
}
}
if (foundSectionLines) {
foundSectionLines.push(line);
}
});
return await new Promise((resolve, reject) => {
rl.on('close', () => {
if (foundSectionLines) {
resolve(foundSectionLines.join('\n'));
} else {
reject(
new Error(
`Markdown section not found: title="${title}" file="${this.mdFilePath}"`,
),
);
}
});
});
}
}

@ -1,86 +0,0 @@
/**
* @license
* Copyright 2020 Balena Ltd.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
// tslint:disable-next-line:import-blacklist
import { stripIndent } from 'common-tags';
import * as _ from 'lodash';
import { promises as fs } from 'fs';
import * as path from 'path';
import { simpleGit } from 'simple-git';
const ROOT = path.normalize(path.join(__dirname, '..'));
/**
* Compare the timestamp of balena-cli.md with the timestamp of staged files,
* issuing an error if balena-cli.md is older.
* If balena-cli.md does not require updating and the developer cannot run
* `npm run build` on their laptop, the error message suggests a workaround
* using `touch`.
*/
async function checkBuildTimestamps() {
const git = simpleGit(ROOT);
const docFile = path.join(ROOT, 'docs', 'balena-cli.md');
const [docStat, gitStatus] = await Promise.all([
fs.stat(docFile),
git.status(),
]);
const stagedFiles = _.uniq([
...gitStatus.created,
...gitStatus.staged,
...gitStatus.renamed.map((o) => o.to),
])
// select only staged files that start with lib/ or typings/
.filter((f) => f.match(/^(lib|typings)[/\\]/))
.map((f) => path.join(ROOT, f));
const fStats = await Promise.all(stagedFiles.map((f) => fs.stat(f)));
fStats.forEach((fStat, index) => {
if (fStat.mtimeMs > docStat.mtimeMs) {
const fPath = stagedFiles[index];
throw new Error(stripIndent`
--------------------------------------------------------------------------------
ERROR: at least one staged file: "${fPath}"
has a more recent modification timestamp than the documentation file:
"${docFile}"
This probably means that \`npm run build\` or \`npm test\` have not been executed,
and this error can be fixed by doing so. Running \`npm run build\` or \`npm test\`
before committing is required in order to update the CLI markdown documentation
(in case any command-line options were updated, added or removed) and also to
catch Typescript type check errors sooner and reduce overall waiting time, given
that the CI build/tests are currently rather lengthy.
If you need/wish to bypass this check without running \`npm run build\`, run:
npx touch -am "${docFile}"
and then try again.
--------------------------------------------------------------------------------
`);
}
});
}
async function run() {
try {
await checkBuildTimestamps();
} catch (err) {
console.error(err.message);
process.exitCode = 1;
}
}
// tslint:disable-next-line:no-floating-promises
run();

View File

@ -1,71 +0,0 @@
#!/usr/bin/env node
'use strict';
/**
* Check that semver v1 is greater than or equal to semver v2.
*
* We don't `require('semver')` to allow this script to be run as an npm
* 'preinstall' hook, at which point no dependencies have been installed.
*
* @param {string} version
*/
function parseSemver(version) {
const match = /v?(\d+)\.(\d+)\.(\d+)/.exec(version);
if (match == null) {
throw new Error(`Invalid semver version: ${version}`);
}
const [, major, minor, patch] = match;
return [parseInt(major, 10), parseInt(minor, 10), parseInt(patch, 10)];
}
/**
* @param {string} v1
* @param {string} v2
*/
function semverGte(v1, v2) {
let v1Array = parseSemver(v1);
let v2Array = parseSemver(v2);
for (let i = 0; i < 3; i++) {
if (v1Array[i] < v2Array[i]) {
return false;
} else if (v1Array[i] > v2Array[i]) {
return true;
}
}
return true;
}
function checkNpmVersion() {
const execSync = require('child_process').execSync;
const npmVersion = execSync('npm --version').toString().trim();
const requiredVersion = '6.9.0';
if (!semverGte(npmVersion, requiredVersion)) {
// In case you take issue with the error message below:
// "At this point, however, your 'npm-shrinkwrap.json' file has
// already been damaged"
// ... and think: "why not add the check to the 'preinstall' hook?",
// the reason is that it would unnecessarily prevent end users from
// using npm v6.4.1 that ships with Node 8. (It is OK for the
// shrinkwrap file to get damaged if it is not going to be reused.)
throw new Error(`\
-----------------------------------------------------------------------------
Error: npm version '${npmVersion}' detected. Please upgrade to npm v${requiredVersion} or later
because of a bug that causes the 'npm-shrinkwrap.json' file to be damaged.
At this point, however, your 'npm-shrinkwrap.json' file has already been
damaged. Please revert it to the master branch state with a command such as:
"git checkout master -- npm-shrinkwrap.json"
Then re-run "npm install" using npm version ${requiredVersion} or later.
-----------------------------------------------------------------------------`);
}
}
function main() {
try {
checkNpmVersion();
} catch (e) {
console.error(e.message || e);
process.exitCode = 1;
}
}
main();
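
For reference, a few illustrative calls to the helpers above (values chosen to match the npm versions mentioned in the error message):

```typescript
// Illustrative checks only; parseSemver() and semverGte() are defined above.
semverGte('6.14.8', '6.9.0'); // true  (6.14.x is newer than 6.9.0)
semverGte('v6.9.0', '6.9.0'); // true  (equal versions; a leading 'v' is accepted)
semverGte('6.4.1', '6.9.0');  // false (npm 6.4.1, as shipped with Node 8, is too old)
parseSemver('not-a-version'); // throws: Invalid semver version: not-a-version
```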

38
automation/custom-types.d.ts vendored Normal file
View File

@ -0,0 +1,38 @@
declare module 'pkg' {
export function exec(args: string[]): Promise<void>;
}
declare module 'filehound' {
export function create(): FileHound;
export interface FileHound {
paths(paths: string[]): FileHound;
paths(...paths: string[]): FileHound;
ext(extensions: string[]): FileHound;
ext(...extensions: string[]): FileHound;
find(): Promise<string[]>;
}
}
declare module 'publish-release' {
interface PublishOptions {
token: string;
owner: string;
repo: string;
tag: string;
name: string;
reuseRelease?: boolean;
assets: string[];
}
interface Release {
html_url: string;
}
let publishRelease: (
args: PublishOptions,
callback: (e: Error, release: Release) => void,
) => void;
export = publishRelease;
}
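
As an illustration of how the `filehound` declaration above is meant to be consumed (the paths and extension are examples only, not taken from this repository):

```typescript
import * as filehound from 'filehound';

// FileHound exposes a fluent interface: pick the directories to search,
// filter by extension, then find() resolves with the matching file paths.
async function listTypeScriptSources(): Promise<string[]> {
	return filehound.create().paths('lib', 'typings').ext('ts').find();
}
```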

View File

@ -1,257 +1,67 @@
/**
* @license
* Copyright 2019 Balena Ltd.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
import * as Promise from 'bluebird';
import * as path from 'path';
import * as os from 'os';
import * as fs from 'fs-extra';
import * as mkdirp from 'mkdirp';
import * as publishRelease from 'publish-release';
import * as archiver from 'archiver';
import * as packageJSON from '../package.json';
import * as Bluebird from 'bluebird';
import * as _ from 'lodash';
import * as semver from 'semver';
import { finalReleaseAssets, version } from './build-bin';
const publishReleaseAsync = Promise.promisify(publishRelease);
const mkdirpAsync = Promise.promisify<string | null, string>(mkdirp);
const { GITHUB_TOKEN } = process.env;
const ROOT = path.join(__dirname, '..');
/**
* Create or update a release in GitHub's releases page, uploading the
* installer files (standalone zip + native oclif installers).
*/
export async function createGitHubRelease() {
console.log(`Publishing release ${version} to GitHub`);
const publishRelease = await import('publish-release');
const ghRelease = await Bluebird.fromCallback(
publishRelease.bind(null, {
token: GITHUB_TOKEN || '',
owner: 'balena-io',
repo: 'balena-cli',
const version = 'v' + packageJSON.version;
const outputFile = path.join(
ROOT,
'build-zip',
`resin-cli-${version}-${os.platform()}-${os.arch()}.zip`,
);
mkdirpAsync(path.dirname(outputFile))
.then(
() =>
new Promise((resolve, reject) => {
console.log('Zipping build...');
let archive = archiver('zip', {
zlib: { level: 7 },
});
archive.directory(path.join(ROOT, 'build-bin'), 'resin-cli');
let outputStream = fs.createWriteStream(outputFile);
outputStream.on('close', resolve);
outputStream.on('error', reject);
archive.on('error', reject);
archive.on('warning', console.warn);
archive.pipe(outputStream);
archive.finalize();
}),
)
.then(() => {
console.log('Build zipped');
console.log('Publishing build...');
return publishReleaseAsync({
token: <string>GITHUB_TOKEN,
owner: 'resin-io',
repo: 'resin-cli',
tag: version,
name: `balena-CLI ${version}`,
name: `Resin-CLI ${version}`,
reuseRelease: true,
assets: finalReleaseAssets[process.platform],
}),
);
console.log(`Release ${version} successful: ${ghRelease.html_url}`);
}
/**
* Top-level function to create a CLI release in GitHub's releases page:
* call zipStandaloneInstaller(), rename the files as we'd like them to
* display on the releases page, and call createGitHubRelease() to upload
* the files.
*/
export async function release() {
try {
await createGitHubRelease();
} catch (err) {
throw new Error(`Error creating GitHub release:\n${err}`);
}
}
/** Return a cached Octokit instance, creating a new one as needed. */
const getOctokit = _.once(function () {
const Octokit = (
require('@octokit/rest') as typeof import('@octokit/rest')
).Octokit.plugin(
(
require('@octokit/plugin-throttling') as typeof import('@octokit/plugin-throttling')
).throttling,
);
return new Octokit({
auth: GITHUB_TOKEN,
throttle: {
onRateLimit: (retryAfter: number, options: any) => {
console.warn(
`Request quota exhausted for request ${options.method} ${options.url}`,
);
// retries 3 times
if (options.request.retryCount < 3) {
console.log(`Retrying after ${retryAfter} seconds!`);
return true;
}
},
onAbuseLimit: (_retryAfter: number, options: any) => {
// does not retry, only logs a warning
console.warn(
`Abuse detected for request ${options.method} ${options.url}`,
);
},
},
assets: [outputFile],
});
})
.then(release => {
console.log(`Release ${version} successful: ${release.html_url}`);
})
.catch(err => {
console.error('Release failed');
console.error(err);
process.exit(1);
});
});
/**
* Extract pagination information (current page, total pages, ordinal number)
* from the 'link' response header (example below), using the parse-link-header
* npm package:
* "link": "<https://api.github.com/repositories/187370853/releases?per_page=2&page=2>; rel=\"next\",
* <https://api.github.com/repositories/187370853/releases?per_page=2&page=3>; rel=\"last\""
*
* @param response Octokit response object (including response.headers.link)
* @param perPageDefault Default per_page pagination value if missing in URL
* @return Object where 'page' is the current page number (1-based),
* 'pages' is the total number of pages, and 'ordinal' is the ordinal number
* (3rd, 4th, 5th...) of the first item in the current page.
*/
function getPageNumbers(
response: any,
perPageDefault: number,
): { page: number; pages: number; ordinal: number } {
const res = { page: 1, pages: 1, ordinal: 1 };
if (!response.headers.link) {
return res;
}
const parse =
require('parse-link-header') as typeof import('parse-link-header');
const parsed = parse(response.headers.link);
if (parsed == null) {
throw new Error(`Failed to parse link header: '${response.headers.link}'`);
}
let perPage = perPageDefault;
if (parsed.next) {
if (parsed.next.per_page) {
perPage = parseInt(parsed.next.per_page, 10);
}
res.page = parseInt(parsed.next.page, 10) - 1;
res.pages = parseInt(parsed.last.page, 10);
} else {
if (parsed.prev.per_page) {
perPage = parseInt(parsed.prev.per_page, 10);
}
res.page = res.pages = parseInt(parsed.prev.page, 10) + 1;
}
res.ordinal = (res.page - 1) * perPage + 1;
return res;
}
/**
* Iterate over every GitHub release in the given owner/repo, check whether
* its tag_name matches against the affectedVersions semver spec, and if so
* replace its release description (body) with the given newDescription value.
* @param owner GitHub repo owner, e.g. 'balena-io' or 'pdcastro'
* @param repo GitHub repo, e.g. 'balena-cli'
* @param affectedVersions Semver spec, e.g. '2.6.1 - 7.10.9 || 8.0.0'
* @param newDescription New release description (body)
* @param editID Short string present in newDescription, e.g. '[AA101]', that
* can be searched to determine whether that release has already been updated.
*/
async function updateGitHubReleaseDescriptions(
owner: string,
repo: string,
affectedVersions: string,
newDescription: string,
editID: string,
) {
const perPage = 30;
const octokit = getOctokit();
const options = await octokit.repos.listReleases.endpoint.merge({
owner,
repo,
per_page: perPage,
});
let errCount = 0;
type Release =
import('@octokit/rest').RestEndpointMethodTypes['repos']['listReleases']['response']['data'][0];
for await (const response of octokit.paginate.iterator<Release>(options)) {
const {
page: thisPage,
pages: totalPages,
ordinal,
} = getPageNumbers(response, perPage);
let i = 0;
for (const cliRelease of response.data) {
const prefix = `[#${ordinal + i++} pg ${thisPage}/${totalPages}]`;
if (!cliRelease.id) {
console.error(
`${prefix} Error: missing release ID (errCount=${++errCount})`,
);
continue;
}
const skipMsg = `${prefix} skipping release "${cliRelease.tag_name}" (${cliRelease.id})`;
if (cliRelease.draft === true) {
console.info(`${skipMsg}: draft release`);
continue;
} else if (cliRelease.body && cliRelease.body.includes(editID)) {
console.info(`${skipMsg}: already updated`);
continue;
} else if (!semver.satisfies(cliRelease.tag_name, affectedVersions)) {
console.info(`${skipMsg}: outside version range`);
continue;
} else {
const updatedRelease = {
owner,
repo,
release_id: cliRelease.id,
body: newDescription,
};
let oldBodyPreview = cliRelease.body;
if (oldBodyPreview) {
oldBodyPreview = oldBodyPreview.replace(/\s+/g, ' ').trim();
if (oldBodyPreview.length > 12) {
oldBodyPreview = oldBodyPreview.substring(0, 9) + '...';
}
}
console.info(
`${prefix} updating release "${cliRelease.tag_name}" (${cliRelease.id}) old body="${oldBodyPreview}"`,
);
try {
await octokit.repos.updateRelease(updatedRelease);
} catch (err) {
console.error(
`${skipMsg}: Error: ${err.message} (count=${++errCount})`,
);
continue;
}
}
}
}
}
/**
* Add a warning description to CLI releases affected by a mixpanel tracking
* security issue (#1359). This function can be executed "manually" with the
* following command line:
*
* npx ts-node --type-check -P automation/tsconfig.json automation/run.ts fix1359
*/
export async function updateDescriptionOfReleasesAffectedByIssue1359() {
// Run only on Linux/Node10, instead of all platform/Node combinations.
// (It could have been any other platform, as long as it only runs once.)
if (process.platform !== 'linux' || semver.major(process.version) !== 10) {
return;
}
const owner = 'balena-io';
const repo = 'balena-cli';
const affectedVersions =
'2.6.1 - 7.10.9 || 8.0.0 - 8.1.0 || 9.0.0 - 9.15.6 || 10.0.0 - 10.17.5 || 11.0.0 - 11.7.2';
const editID = '[AA100]';
let newDescription = `
Please note: the "login" command in this release is affected by a
security issue fixed in versions
[7.10.10](https://github.com/balena-io/balena-cli/releases/tag/v7.10.10),
[8.1.1](https://github.com/balena-io/balena-cli/releases/tag/v8.1.1),
[9.15.7](https://github.com/balena-io/balena-cli/releases/tag/v9.15.7),
[10.17.6](https://github.com/balena-io/balena-cli/releases/tag/v10.17.6),
[11.7.3](https://github.com/balena-io/balena-cli/releases/tag/v11.7.3)
and later. If you need to use this version, avoid passing your password,
keys or tokens as command-line arguments. ${editID}`;
// remove line breaks and collapse white space
newDescription = newDescription.replace(/\s+/g, ' ').trim();
await updateGitHubReleaseDescriptions(
owner,
repo,
affectedVersions,
newDescription,
editID,
);
}
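
To make the pagination logic above easier to follow, here is the rough shape that `getPageNumbers()` expects `parse-link-header` to return for the example 'link' header quoted in its doc comment (field values are strings, hence the `parseInt()` calls):

```typescript
const parsedLinkHeaderExample = {
	next: {
		page: '2',
		per_page: '2',
		rel: 'next',
		url: 'https://api.github.com/repositories/187370853/releases?per_page=2&page=2',
	},
	last: {
		page: '3',
		per_page: '2',
		rel: 'last',
		url: 'https://api.github.com/repositories/187370853/releases?per_page=2&page=3',
	},
};
// With perPageDefault = 2, getPageNumbers() would report:
//   { page: 1, pages: 3, ordinal: 1 }
// i.e. the response is page 1 of 3, and its first release is the 1st overall.
```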

View File

@ -1,107 +0,0 @@
/**
* @license
* Copyright 2019 Balena Ltd.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
import * as _ from 'lodash';
import {
buildOclifInstaller,
buildStandaloneZip,
catchUncommitted,
testShrinkwrap,
} from './build-bin';
import {
release,
updateDescriptionOfReleasesAffectedByIssue1359,
} from './deploy-bin';
// Normalize DEBUG: negative values ('0', 'no', 'false', empty or unset) become '', anything else becomes '1'
process.env.DEBUG = ['0', 'no', 'false', '', undefined].includes(
process.env.DEBUG?.toLowerCase(),
)
? ''
: '1';
/**
* Trivial command-line parser. Check whether the command-line argument is one
* of the following strings, then call the appropriate functions:
* 'build:installer' (to build a native oclif installer)
* 'build:standalone' (to build a standalone pkg package)
* 'release' (to create/update a GitHub release)
*
* @param args Arguments to parse (default is process.argv.slice(2))
*/
async function parse(args?: string[]) {
args = args || process.argv.slice(2);
console.error(`[debug] automation/run.ts process.argv=[${process.argv}]`);
console.error(`[debug] automation/run.ts args=[${args}]`);
if (_.isEmpty(args)) {
throw new Error('missing command-line arguments');
}
const commands: { [cmd: string]: () => void | Promise<void> } = {
'build:installer': buildOclifInstaller,
'build:standalone': buildStandaloneZip,
'catch-uncommitted': catchUncommitted,
'test-shrinkwrap': testShrinkwrap,
fix1359: updateDescriptionOfReleasesAffectedByIssue1359,
release,
};
for (const arg of args) {
if (!commands.hasOwnProperty(arg)) {
throw new Error(`command unknown: ${arg}`);
}
}
// The BUILD_TMP env var is used as an alternative location for oclif
// (patched) to copy/extract the CLI files, run npm install and then
// create the NSIS executable installer for Windows. This was necessary
// to avoid issues with a 260-char limit on Windows paths (possibly a
// limitation of some library used by NSIS), as the "current working dir"
// provided by balena CI is a rather long path to start with.
if (process.platform === 'win32' && !process.env.BUILD_TMP) {
const randID = (await import('crypto'))
.randomBytes(6)
.toString('base64')
.replace(/\+/g, '-')
.replace(/\//g, '_'); // base64url (RFC 4648)
process.env.BUILD_TMP = `C:\\tmp\\${randID}`;
}
for (const arg of args) {
try {
const cmdFunc = commands[arg];
await cmdFunc();
} catch (err) {
if (typeof err === 'object') {
err.message = `"${arg}": ${err.message}`;
}
throw err;
}
}
}
/** See jsdoc for parse() function above */
export async function run(args?: string[]) {
try {
await parse(args);
} catch (e) {
console.error(e.message ? `Error: ${e.message}` : e);
process.exitCode = 1;
}
}
// tslint:disable-next-line:no-floating-promises
run();
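
The exported `run()` can also be driven programmatically rather than via `process.argv`; a minimal sketch (the import path assumes a sibling script under `automation/`):

```typescript
import { run } from './run';

async function buildAll(): Promise<void> {
	// Equivalent to passing the commands on the command line, e.g.:
	//   npx ts-node --type-check -P automation/tsconfig.json automation/run.ts build:standalone build:installer
	await run(['build:standalone', 'build:installer']);
}
```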

View File

@ -1,20 +0,0 @@
#!/bin/bash
set -e
cp npm-shrinkwrap.json npm-shrinkwrap.json.old
npm i
npm dedupe
npm i
if ! diff -q npm-shrinkwrap.json npm-shrinkwrap.json.old > /dev/null; then
rm npm-shrinkwrap.json.old
echo "** npm-shrinkwrap.json was not deduplicated or not fully committed - FAIL **";
echo "** This can usually be fixed with: **";
echo "** git checkout master -- npm-shrinkwrap.json **";
echo "** rm -rf node_modules **";
echo "** npm install && npm dedupe && npm install **";
exit 1;
fi
rm npm-shrinkwrap.json.old

16
automation/tsconfig.json Normal file
View File

@ -0,0 +1,16 @@
{
"compilerOptions": {
"module": "commonjs",
"target": "es2015",
"strict": true,
"noUnusedLocals": true,
"noUnusedParameters": true,
"preserveConstEnums": true,
"removeComments": true,
"sourceMap": true
},
"include": [
"./**/*.ts",
"../typings/*.d.ts"
]
}

View File

@ -1,140 +0,0 @@
import { exec } from 'child_process';
import * as semver from 'semver';
const changeTypes = ['major', 'minor', 'patch'] as const;
const validateChangeType = (maybeChangeType: string = 'minor') => {
maybeChangeType = maybeChangeType.toLowerCase();
switch (maybeChangeType) {
case 'patch':
case 'minor':
case 'major':
return maybeChangeType;
default:
throw new Error(`Invalid change type: '${maybeChangeType}'`);
}
};
const compareSemverChangeType = (oldVersion: string, newVersion: string) => {
const oldSemver = semver.parse(oldVersion)!;
const newSemver = semver.parse(newVersion)!;
for (const changeType of changeTypes) {
if (oldSemver[changeType] !== newSemver[changeType]) {
return changeType;
}
}
};
const run = async (cmd: string) => {
console.info(`Running '${cmd}'`);
return new Promise<{ stdout: string; stderr: string }>((resolve, reject) => {
const p = exec(cmd, { encoding: 'utf8' }, (err, stdout, stderr) => {
if (err) {
reject(err);
return;
}
resolve({ stdout, stderr });
});
p.stdout?.pipe(process.stdout);
p.stderr?.pipe(process.stderr);
});
};
const getVersion = async (module: string): Promise<string> => {
const { stdout } = await run(`npm ls --json --depth 0 ${module}`);
return JSON.parse(stdout).dependencies[module].version;
};
interface Upstream {
repo: string;
url: string;
module?: string;
}
const getUpstreams = async () => {
const fs = await import('fs');
const repoYaml = fs.readFileSync(__dirname + '/../repo.yml', 'utf8');
const yaml = await import('js-yaml');
const { upstream } = yaml.load(repoYaml) as {
upstream: Upstream[];
};
return upstream;
};
const getUsage = (upstreams: Upstream[], upstreamName: string) => `
Usage: npm run update ${upstreamName} $version [$changeType=minor]
Upstream names: ${upstreams.map(({ repo }) => repo).join(', ')}
`;
async function $main() {
const upstreams = await getUpstreams();
if (process.argv.length < 3) {
throw new Error(getUsage(upstreams, '$upstreamName'));
}
const upstreamName = process.argv[2];
const upstream = upstreams.find((v) => v.repo === upstreamName);
if (!upstream) {
throw new Error(
`Invalid upstream name '${upstreamName}', valid options: ${upstreams
.map(({ repo }) => repo)
.join(', ')}`,
);
}
if (process.argv.length < 4) {
throw new Error(getUsage(upstreams, upstreamName));
}
const packageName = upstream.module || upstream.repo;
const oldVersion = await getVersion(packageName);
await run(`npm install ${packageName}@${process.argv[3]}`);
const newVersion = await getVersion(packageName);
if (newVersion === oldVersion) {
throw new Error(`Already on version '${newVersion}'`);
}
console.log(`Updated ${upstreamName} from ${oldVersion} to ${newVersion}`);
const semverChangeType = compareSemverChangeType(oldVersion, newVersion);
const changeType = process.argv[4]
? // if the caller specified a change type, use that one
validateChangeType(process.argv[4])
: // use the same change type as in the dependency, but avoid major bumps
semverChangeType && semverChangeType !== 'major'
? semverChangeType
: 'minor';
console.log(`Using Change-type: ${changeType}`);
let { stdout: currentBranch } = await run('git rev-parse --abbrev-ref HEAD');
currentBranch = currentBranch.trim();
console.log(`Currently on branch: '${currentBranch}'`);
if (currentBranch === 'master') {
await run(`git checkout -b "update-${upstreamName}-${newVersion}"`);
}
await run(`git add package.json npm-shrinkwrap.json`);
await run(
`git commit --message "Update ${upstreamName} to ${newVersion}" --message "Update ${upstreamName} from ${oldVersion} to ${newVersion}" --message "Change-type: ${changeType}"`,
);
}
async function main() {
try {
await $main();
} catch (e) {
console.error(e);
process.exitCode = 1;
}
}
// tslint:disable-next-line:no-floating-promises
main();
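
A few illustrative results for the helpers above, to make the Change-type selection logic easier to follow:

```typescript
// Sketch only; both helpers are defined earlier in this script.
compareSemverChangeType('12.1.0', '12.2.3'); // 'minor'   (first component that differs)
compareSemverChangeType('12.1.0', '13.0.0'); // 'major'
compareSemverChangeType('12.1.0', '12.1.0'); // undefined (no component differs)
validateChangeType('PATCH');                 // 'patch'   (case-insensitive)
validateChangeType();                        // 'minor'   (default)
// A 'major' bump in the dependency still results in "Change-type: minor" here,
// unless a change type is passed explicitly as the optional $changeType argument.
```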

View File

@ -1,148 +0,0 @@
/**
* @license
* Copyright 2019-2020 Balena Ltd.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
import { spawn } from 'child_process';
import * as _ from 'lodash';
import * as path from 'path';
export const ROOT = path.join(__dirname, '..');
/** Tap and buffer this process' stdout and stderr */
export class StdOutTap {
public stdoutBuf: string[] = [];
public stderrBuf: string[] = [];
public allBuf: string[] = []; // both stdout and stderr
protected origStdoutWrite: typeof process.stdout.write;
protected origStderrWrite: typeof process.stdout.write;
constructor(protected printDots = false) {}
tap() {
this.origStdoutWrite = process.stdout.write;
this.origStderrWrite = process.stderr.write;
process.stdout.write = (chunk: string, ...args: any[]): boolean => {
this.stdoutBuf.push(chunk);
this.allBuf.push(chunk);
const str = this.printDots ? '.' : chunk;
return this.origStdoutWrite.call(process.stdout, str, ...args);
};
process.stderr.write = (chunk: string, ...args: any[]): boolean => {
this.stderrBuf.push(chunk);
this.allBuf.push(chunk);
const str = this.printDots ? '.' : chunk;
return this.origStderrWrite.call(process.stderr, str, ...args);
};
}
untap() {
process.stdout.write = this.origStdoutWrite;
process.stderr.write = this.origStderrWrite;
if (this.printDots) {
console.error('');
}
}
}
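// Minimal usage sketch for StdOutTap (illustrative only): tap(), run some code
// that writes to stdout/stderr, untap(), then inspect the buffered output.
//
//   const tap = new StdOutTap(true); // print dots instead of the real output
//   tap.tap();
//   console.log('hello');            // captured in tap.stdoutBuf and tap.allBuf
//   tap.untap();
//   console.error(tap.allBuf.join(''));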
/**
* Diff strings by line, using the 'diff' npm package:
* https://www.npmjs.com/package/diff
*/
export function diffLines(str1: string, str2: string): string {
const { diffTrimmedLines } = require('diff');
const diffObjs = diffTrimmedLines(str1, str2);
const prefix = (chunk: string, char: string) =>
chunk
.split('\n')
.map((line: string) => `${char} ${line}`)
.join('\n');
const diffStr = diffObjs
.map((part: any) => {
return part.added
? prefix(part.value, '+')
: part.removed
? prefix(part.value, '-')
: prefix(part.value, ' ');
})
.join('\n');
return diffStr;
}
export function loadPackageJson() {
return require(path.join(ROOT, 'package.json'));
}
/**
* Error handling wrapper around the npm `which` package:
* "Like the unix which utility. Finds the first instance of a specified
* executable in the PATH environment variable. Does not cache the results,
* so hash -r is not needed when the PATH changes."
*
* @param program Basename of a program, for example 'ssh'
* @returns The program's full path, e.g. 'C:\WINDOWS\System32\OpenSSH\ssh.EXE'
*/
export async function which(program: string): Promise<string> {
const whichMod = await import('which');
let programPath: string;
try {
programPath = await whichMod(program);
} catch (err) {
if (err.code === 'ENOENT') {
throw new Error(`'${program}' program not found. Is it installed?`);
}
throw err;
}
return programPath;
}
/**
* Call which(programName) and spawn() with the given arguments. Throw an error
* if the process exit code is not zero.
*/
export async function whichSpawn(
programName: string,
args: string[] = [],
): Promise<void> {
const program = await which(programName);
let error: Error | undefined;
let exitCode: number | undefined;
try {
exitCode = await new Promise<number>((resolve, reject) => {
try {
spawn(program, args, { stdio: 'inherit' })
.on('error', reject)
.on('close', resolve);
} catch (err) {
reject(err);
}
});
} catch (err) {
error = err;
}
if (error || exitCode) {
const msg = [
`${programName} failed with exit code ${exitCode}:`,
`"${program}" [${args}]`,
];
if (error) {
msg.push(`${error}`);
}
throw new Error(msg.join('\n'));
}
}
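
A minimal usage sketch for `whichSpawn()` above (the program and arguments are illustrative):

```typescript
async function printSshVersion(): Promise<void> {
	try {
		// Resolves once 'ssh -V' exits with code 0; on a non-zero exit code or
		// spawn error it throws, including the resolved program path in the message.
		await whichSpawn('ssh', ['-V']);
	} catch (err) {
		console.error(err.message);
	}
}
```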

View File

@ -1,23 +0,0 @@
#!/usr/bin/env node
// tslint:disable:no-var-requires
// We boost the threadpool size because ext2fs can otherwise deadlock on
// some operations if the pool runs out.
process.env.UV_THREADPOOL_SIZE = '64';
// Disable oclif registering ts-node
process.env.OCLIF_TS_NODE = 0;
async function run() {
// Use fast-boot to cache require lookups, speeding up startup
await require('../build/fast-boot').start();
// Set the desired es version for downstream modules that support it
require('@balena/es-version').set('es2018');
// Run the CLI
await require('../build/app').run();
}
run();

View File

@ -1,89 +0,0 @@
#!/usr/bin/env node
// ****************************************************************************
// THIS IS FOR DEV PURPOSES ONLY AND WILL NOT BE PART OF THE PUBLISHED PACKAGE
// Before opening a PR you should build and test your changes using bin/balena
// ****************************************************************************
// tslint:disable:no-var-requires
// We boost the threadpool size because ext2fs can otherwise deadlock on
// some operations if the pool runs out.
process.env.UV_THREADPOOL_SIZE = '64';
// Note on `fast-boot2`: We do not use `fast-boot2` with `balena-dev` because:
// * fast-boot2's cacheKiller option is configured to include the timestamps of
// the package.json and npm-shrinkwrap.json files, to avoid unexpected CLI
// behavior when changes are made to dependencies during development. This is
// generally a good thing, however, `balena-dev` (a few lines below) edits
// `package.json` to modify oclif paths, and this results in cache
// invalidation and a performance hit rather than speedup.
// * Even if the timestamps are removed from cacheKiller, so that there is no
// cache invalidation, fast-boot's speedup is barely noticeable when ts-node
// is used, e.g. 1.43s vs 1.4s when running `balena version`.
// * `fast-boot` causes unexpected behavior when used with `npm link` or
// when the `node_modules` folder is manually modified (affecting transitive
// dependencies) during development (e.g. bug investigations). A workaround
// is to use `balena-dev` without `fast-boot`. See also notes in
// `CONTRIBUTING.md`.
const path = require('path');
const rootDir = path.join(__dirname, '..');
// Allow balena-dev to work with oclif by temporarily
// pointing oclif config options to lib/ instead of build/
modifyOclifPaths();
// Undo changes on exit
process.on('exit', function () {
modifyOclifPaths(true);
});
// Undo changes in case of ctrl-c
process.on('SIGINT', function () {
modifyOclifPaths(true);
// Note: calling process.exit() here interferes with commands that do their own SIGINT handling,
// but without it commands cannot be exited at all.
// So currently, using balena-dev does not guarantee proper exit behaviour when using ctrl-c.
// Ideally a better solution is needed.
process.exit();
});
// Set the desired es version for downstream modules that support it
require('@balena/es-version').set('es2018');
// Note: before ts-node v6.0.0, 'transpile-only' (no type checking) was the
// default option. We upgraded ts-node and found that adding 'transpile-only'
// was necessary to avoid a mysterious 'null' error message. On the plus side,
// it is supposed to run faster. We still benefit from type checking when
// running 'npm run build'.
require('ts-node').register({
project: path.join(rootDir, 'tsconfig.json'),
transpileOnly: true,
});
require('../lib/app').run();
// Modify package.json oclif paths from build/ -> lib/, or vice versa
function modifyOclifPaths(revert) {
const fs = require('fs');
const packageJsonPath = path.join(rootDir, 'package.json');
const packageJson = fs.readFileSync(packageJsonPath, 'utf8');
const packageObj = JSON.parse(packageJson);
if (!packageObj.oclif) {
return;
}
let oclifSectionText = JSON.stringify(packageObj.oclif);
if (!revert) {
oclifSectionText = oclifSectionText.replace(/\/build\//g, '/lib/');
} else {
oclifSectionText = oclifSectionText.replace(/\/lib\//g, '/build/');
}
packageObj.oclif = JSON.parse(oclifSectionText);
fs.writeFileSync(
packageJsonPath,
`${JSON.stringify(packageObj, null, 2)}\n`,
'utf8',
);
}
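
For illustration, this is the kind of rewrite `modifyOclifPaths()` performs on the `oclif` section of `package.json` (the keys shown are assumptions; only the `/build/` to `/lib/` substitution comes from the code above):

```typescript
// Hypothetical 'oclif' section before balena-dev starts:
const oclifInPackageJson = {
	commands: './build/commands',
	hooks: { prerun: ['./build/hooks/prerun'] },
};
// ...and after modifyOclifPaths() rewrites it (reverted again on exit/SIGINT):
const oclifWhileBalenaDevRuns = {
	commands: './lib/commands',
	hooks: { prerun: ['./lib/hooks/prerun'] },
};
```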

7
bin/resin Executable file
View File

@ -0,0 +1,7 @@
#!/usr/bin/env node
// We boost the threadpool size because ext2fs can otherwise deadlock on
// some operations if the pool runs out.
process.env.UV_THREADPOOL_SIZE = '64';
require('../build/app');

154
capitanodoc.ts Normal file
View File

@ -0,0 +1,154 @@
export = {
title: 'Resin CLI Documentation',
introduction: `\
This tool allows you to interact with the resin.io API from the comfort of your command line.
Please make sure your system meets the requirements as specified in the [README](https://github.com/resin-io/resin-cli).
## Install the CLI
### Npm install
The best supported way to install the CLI is from npm:
$ npm install resin-cli -g --production --unsafe-perm
\`--unsafe-perm\` is only required on systems where the global install directory is not user-writable.
This allows npm install steps to download and save prebuilt native binaries. You may be able to omit it,
especially if you're using a user-managed node install such as [nvm](https://github.com/creationix/nvm).
### Standalone install
Alternatively, if you don't have a node or pre-gyp environment, you can still install the CLI as a standalone
binary. **This is experimental and may not work perfectly yet in all environments**, but works well in
initial cross-platform testing, so it may be useful, and we'd love your feedback if you hit any issues.
To install the CLI as a standalone binary:
* Download the latest zip for your OS from https://github.com/resin-io/resin-cli/releases.
* Extract the contents, putting the \`resin-cli\` folder somewhere appropriate for your system (e.g. \`C:/resin-cli\`, \`/usr/local/lib/resin-cli\`, etc).
* Add the \`resin-cli\` folder to your \`PATH\`. (
[Windows instructions](https://www.computerhope.com/issues/ch000549.htm),
[Linux instructions](https://stackoverflow.com/questions/14637979/how-to-permanently-set-path-on-linux-unix),
[OSX instructions](https://stackoverflow.com/questions/22465332/setting-path-environment-variable-in-osx-permanently))
* Running \`resin\` in a fresh command line should print the resin CLI help.
To update in future, simply download a new release and replace the extracted folder.
Have any problems, or see any unexpected behaviour? Please file an issue!
## Getting started
Once you have the CLI installed, you'll need to log in, so it can access everything in your resin.io account.
To authenticate yourself, run:
$ resin login
You now have access to all the commands referenced below.
## Proxy support
The CLI does support HTTP(S) proxies.
You can configure the proxy using several methods (in order of their precedence):
* set the \`RESINRC_PROXY\` environment variable in the URL format (with protocol, host, port, and optionally the basic auth),
* use the [resin config file](https://www.npmjs.com/package/resin-settings-client#documentation) (project-specific or user-level)
and set the \`proxy\` setting. This can be:
* a string in the URL format,
* or an object following [this format](https://www.npmjs.com/package/global-tunnel-ng#options), which allows more control,
* or set the conventional \`https_proxy\` / \`HTTPS_PROXY\` / \`http_proxy\` / \`HTTP_PROXY\`
environment variable (in the same standard URL format).\
`,
categories: [
{
title: 'Api keys',
files: [ 'build/actions/api-key.js' ],
},
{
title: 'Application',
files: [ 'build/actions/app.js' ]
},
{
title: 'Authentication',
files: [ 'build/actions/auth.js' ]
},
{
title: 'Device',
files: [ 'build/actions/device.js' ]
},
{
title: 'Environment Variables',
files: [ 'build/actions/environment-variables.js' ]
},
{
title: 'Help',
files: [ 'build/actions/help.js' ]
},
{
title: 'Information',
files: [ 'build/actions/info.js' ]
},
{
title: 'Keys',
files: [ 'build/actions/keys.js' ]
},
{
title: 'Logs',
files: [ 'build/actions/logs.js' ]
},
{
title: 'Sync',
files: [ 'build/actions/sync.js' ]
},
{
title: 'SSH',
files: [ 'build/actions/ssh.js' ]
},
{
title: 'Notes',
files: [ 'build/actions/notes.js' ]
},
{
title: 'OS',
files: [ 'build/actions/os.js' ]
},
{
title: 'Config',
files: [ 'build/actions/config.js' ]
},
{
title: 'Preload',
files: [ 'build/actions/preload.js' ]
},
{
title: 'Push',
files: [ 'build/actions/push.js' ]
},
{
title: 'Settings',
files: [ 'build/actions/settings.js' ]
},
{
title: 'Wizard',
files: [ 'build/actions/wizard.js' ]
},
{
title: 'Local',
files: [ 'build/actions/local/index.js' ]
},
{
title: 'Deploy',
files: [
'build/actions/build.js',
'build/actions/deploy.js'
]
},
{
title: 'Utilities',
files: [ 'build/actions/util.js' ]
},
]
};

127
coffeelint.json Normal file
View File

@ -0,0 +1,127 @@
{
"coffeescript_error": {
"level": "error"
},
"arrow_spacing": {
"name": "arrow_spacing",
"level": "error"
},
"no_tabs": {
"name": "no_tabs",
"level": "ignore"
},
"no_trailing_whitespace": {
"name": "no_trailing_whitespace",
"level": "error",
"allowed_in_comments": false,
"allowed_in_empty_lines": false
},
"max_line_length": {
"name": "max_line_length",
"value": 120,
"level": "error",
"limitComments": true
},
"line_endings": {
"name": "line_endings",
"level": "ignore",
"value": "unix"
},
"no_trailing_semicolons": {
"name": "no_trailing_semicolons",
"level": "error"
},
"indentation": {
"name": "indentation",
"value": 1,
"level": "error"
},
"camel_case_classes": {
"name": "camel_case_classes",
"level": "error"
},
"colon_assignment_spacing": {
"name": "colon_assignment_spacing",
"level": "error",
"spacing": {
"left": 0,
"right": 1
}
},
"no_implicit_braces": {
"name": "no_implicit_braces",
"level": "ignore",
"strict": false
},
"no_plusplus": {
"name": "no_plusplus",
"level": "ignore"
},
"no_throwing_strings": {
"name": "no_throwing_strings",
"level": "error"
},
"no_backticks": {
"name": "no_backticks",
"level": "error"
},
"no_implicit_parens": {
"name": "no_implicit_parens",
"strict": false,
"level": "ignore"
},
"no_empty_param_list": {
"name": "no_empty_param_list",
"level": "error"
},
"no_stand_alone_at": {
"name": "no_stand_alone_at",
"level": "ignore"
},
"space_operators": {
"name": "space_operators",
"level": "error"
},
"duplicate_key": {
"name": "duplicate_key",
"level": "error"
},
"empty_constructor_needs_parens": {
"name": "empty_constructor_needs_parens",
"level": "ignore"
},
"cyclomatic_complexity": {
"name": "cyclomatic_complexity",
"value": 10,
"level": "ignore"
},
"newlines_after_classes": {
"name": "newlines_after_classes",
"value": 3,
"level": "ignore"
},
"no_unnecessary_fat_arrows": {
"name": "no_unnecessary_fat_arrows",
"level": "error"
},
"missing_fat_arrows": {
"name": "missing_fat_arrows",
"level": "ignore"
},
"non_empty_constructor_needs_parens": {
"name": "non_empty_constructor_needs_parens",
"level": "ignore"
},
"no_unnecessary_double_quotes": {
"name": "no_unnecessary_double_quotes",
"level": "error"
},
"no_debugger": {
"name": "no_debugger",
"level": "warn"
},
"no_interpolation_in_single_quotes": {
"name": "no_interpolation_in_single_quotes",
"level": "error"
}
}

View File

@ -1,83 +0,0 @@
#compdef balena
#autoload
#GENERATED FILE DON'T MODIFY#
_balena() {
typeset -A opt_args
local context state line curcontext="$curcontext"
# Valid top-level completions
main_commands=( build deploy envs fleets join keys leave login logout logs note orgs preload push releases scan settings ssh support tags tunnel version whoami api-key config device device devices env fleet fleet internal key key local os release release tag util )
# Sub-completions
api_key_cmds=( generate )
config_cmds=( generate inject read reconfigure write )
device_cmds=( deactivate identify init local-mode move os-update pin public-url purge reboot register rename restart rm services shutdown track-fleet )
devices_cmds=( supported )
env_cmds=( add rename rm )
fleet_cmds=( create pin purge rename restart rm track-latest )
internal_cmds=( osinit )
key_cmds=( add rm )
local_cmds=( configure flash )
os_cmds=( build-config configure download initialize versions )
release_cmds=( finalize invalidate validate )
tag_cmds=( rm set )
_arguments -C \
'(- 1 *)--version[show version and exit]' \
'(- 1 *)'{-h,--help}'[show help options and exit]' \
'1:first command:_balena_main_cmds' \
'2:second command:_balena_sec_cmds' \
&& ret=0
}
(( $+functions[_balena_main_cmds] )) ||
_balena_main_cmds() {
_describe -t main_commands 'command' main_commands "$@" && ret=0
}
(( $+functions[_balena_sec_cmds] )) ||
_balena_sec_cmds() {
case $line[1] in
"api-key")
_describe -t api_key_cmds 'api-key_cmd' api_key_cmds "$@" && ret=0
;;
"config")
_describe -t config_cmds 'config_cmd' config_cmds "$@" && ret=0
;;
"device")
_describe -t device_cmds 'device_cmd' device_cmds "$@" && ret=0
;;
"devices")
_describe -t devices_cmds 'devices_cmd' devices_cmds "$@" && ret=0
;;
"env")
_describe -t env_cmds 'env_cmd' env_cmds "$@" && ret=0
;;
"fleet")
_describe -t fleet_cmds 'fleet_cmd' fleet_cmds "$@" && ret=0
;;
"internal")
_describe -t internal_cmds 'internal_cmd' internal_cmds "$@" && ret=0
;;
"key")
_describe -t key_cmds 'key_cmd' key_cmds "$@" && ret=0
;;
"local")
_describe -t local_cmds 'local_cmd' local_cmds "$@" && ret=0
;;
"os")
_describe -t os_cmds 'os_cmd' os_cmds "$@" && ret=0
;;
"release")
_describe -t release_cmds 'release_cmd' release_cmds "$@" && ret=0
;;
"tag")
_describe -t tag_cmds 'tag_cmd' tag_cmds "$@" && ret=0
;;
esac
}
_balena "$@"

View File

@ -1,80 +0,0 @@
#!/bin/bash
#GENERATED FILE DON'T MODIFY#
_balena_complete()
{
local cur prev
# Valid top-level completions
main_commands="build deploy envs fleets join keys leave login logout logs note orgs preload push releases scan settings ssh support tags tunnel version whoami api-key config device device devices env fleet fleet internal key key local os release release tag util"
# Sub-completions
api_key_cmds="generate"
config_cmds="generate inject read reconfigure write"
device_cmds="deactivate identify init local-mode move os-update pin public-url purge reboot register rename restart rm services shutdown track-fleet"
devices_cmds="supported"
env_cmds="add rename rm"
fleet_cmds="create pin purge rename restart rm track-latest"
internal_cmds="osinit"
key_cmds="add rm"
local_cmds="configure flash"
os_cmds="build-config configure download initialize versions"
release_cmds="finalize invalidate validate"
tag_cmds="rm set"
COMPREPLY=()
cur=${COMP_WORDS[COMP_CWORD]}
prev=${COMP_WORDS[COMP_CWORD-1]}
if [ $COMP_CWORD -eq 1 ]
then
COMPREPLY=( $(compgen -W "${main_commands}" -- $cur) )
elif [ $COMP_CWORD -eq 2 ]
then
case "$prev" in
api-key)
COMPREPLY=( $(compgen -W "$api_key_cmds" -- $cur) )
;;
config)
COMPREPLY=( $(compgen -W "$config_cmds" -- $cur) )
;;
device)
COMPREPLY=( $(compgen -W "$device_cmds" -- $cur) )
;;
devices)
COMPREPLY=( $(compgen -W "$devices_cmds" -- $cur) )
;;
env)
COMPREPLY=( $(compgen -W "$env_cmds" -- $cur) )
;;
fleet)
COMPREPLY=( $(compgen -W "$fleet_cmds" -- $cur) )
;;
internal)
COMPREPLY=( $(compgen -W "$internal_cmds" -- $cur) )
;;
key)
COMPREPLY=( $(compgen -W "$key_cmds" -- $cur) )
;;
local)
COMPREPLY=( $(compgen -W "$local_cmds" -- $cur) )
;;
os)
COMPREPLY=( $(compgen -W "$os_cmds" -- $cur) )
;;
release)
COMPREPLY=( $(compgen -W "$release_cmds" -- $cur) )
;;
tag)
COMPREPLY=( $(compgen -W "$tag_cmds" -- $cur) )
;;
"*")
;;
esac
fi
}
complete -F _balena_complete balena

View File

@ -1,175 +0,0 @@
/**
* @license
* Copyright 2021 Balena Ltd.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
const path = require('path');
const rootDir = path.join(__dirname, '..');
const fs = require('fs');
const manifestFile = 'oclif.manifest.json';
const commandsFilePath = path.join(rootDir, manifestFile);
if (fs.existsSync(commandsFilePath)) {
console.log('Generating shell auto completion files...');
} else {
console.error(`generate-completion.js: Could not find "${manifestFile}"`);
process.exitCode = 1;
return;
}
const commandsJson = JSON.parse(fs.readFileSync(commandsFilePath, 'utf8'));
var mainCommands = [];
var additionalCommands = [];
for (const key of Object.keys(commandsJson.commands)) {
const cmd = key.split(':');
if (cmd.length > 1) {
additionalCommands.push(cmd);
if (!mainCommands.includes(cmd[0])) {
mainCommands.push(cmd[0]);
}
} else {
mainCommands.push(cmd[0]);
}
}
const mainCommandsStr = mainCommands.join(' ');
// GENERATE BASH COMPLETION FILE
const bashFilePathIn = path.join(__dirname, '/templates/bash.template');
const bashFilePathOut = path.join(__dirname, 'balena-completion.bash');
try {
fs.unlinkSync(bashFilePathOut);
} catch (error) {
process.exitCode = 1;
return console.error(error);
}
fs.readFile(bashFilePathIn, 'utf8', function (err, data) {
if (err) {
process.exitCode = 1;
return console.error(err);
}
data = data.replace(
'#TEMPLATE FILE FOR BASH COMPLETION#',
"#GENERATED FILE DON'T MODIFY#",
);
data = data.replace(
/\$main_commands\$/g,
'main_commands="' + mainCommandsStr + '"',
);
var subCommands = [];
var prevElement = additionalCommands[0][0];
additionalCommands.forEach(function (element) {
if (element[0] === prevElement) {
subCommands.push(element[1]);
} else {
const prevElement2 = prevElement.replace(/-/g, '_') + '_cmds';
data = data.replace(
/\$sub_cmds\$/g,
' ' + prevElement2 + '="' + subCommands.join(' ') + '"\n$sub_cmds$',
);
data = data.replace(
/\$sub_cmds_prev\$/g,
' ' +
prevElement +
')\n COMPREPLY=( $(compgen -W "$' +
prevElement2 +
'" -- $cur) )\n ;;\n$sub_cmds_prev$',
);
prevElement = element[0];
subCommands = [];
subCommands.push(element[1]);
}
});
// cleanup placeholders
data = data.replace(/\$sub_cmds\$/g, '');
data = data.replace(/\$sub_cmds_prev\$/g, '');
fs.writeFile(bashFilePathOut, data, 'utf8', function (error) {
if (error) {
process.exitCode = 1;
return console.error(error);
}
});
});
// GENERATE ZSH COMPLETION FILE
const zshFilePathIn = path.join(__dirname, '/templates/zsh.template');
const zshFilePathOut = path.join(__dirname, '_balena');
try {
fs.unlinkSync(zshFilePathOut);
} catch (error) {
process.exitCode = 1;
return console.error(error);
}
fs.readFile(zshFilePathIn, 'utf8', function (err, data) {
if (err) {
process.exitCode = 1;
return console.error(err);
}
data = data.replace(
'#TEMPLATE FILE FOR ZSH COMPLETION#',
"#GENERATED FILE DON'T MODIFY#",
);
data = data.replace(
/\$main_commands\$/g,
'main_commands=( ' + mainCommandsStr + ' )',
);
var subCommands = [];
var prevElement = additionalCommands[0][0];
additionalCommands.forEach(function (element) {
if (element[0] === prevElement) {
subCommands.push(element[1]);
} else {
const prevElement2 = prevElement.replace(/-/g, '_') + '_cmds';
data = data.replace(
/\$sub_cmds\$/g,
' ' + prevElement2 + '=( ' + subCommands.join(' ') + ' )\n$sub_cmds$',
);
data = data.replace(
/\$sub_cmds_prev\$/g,
' "' +
prevElement +
'")\n _describe -t ' +
prevElement2 +
" '" +
prevElement +
"_cmd' " +
prevElement2 +
' "$@" && ret=0\n ;;\n$sub_cmds_prev$',
);
prevElement = element[0];
subCommands = [];
subCommands.push(element[1]);
}
});
// cleanup placeholders
data = data.replace(/\$sub_cmds\$/g, '');
data = data.replace(/\$sub_cmds_prev\$/g, '');
fs.writeFile(zshFilePathOut, data, 'utf8', function (error) {
if (error) {
process.exitCode = 1;
return console.error(error);
}
});
});
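
A rough sketch of the `oclif.manifest.json` shape this script consumes: command ids are the keys of the `commands` object, and ids containing ':' become sub-completions (the command names below are examples only):

```typescript
const manifestExample = {
	commands: {
		'login': {},
		'env': {},
		'env:add': {},
		'env:rm': {},
	},
};
// From these keys the script derives:
//   mainCommands       = ['login', 'env']
//   additionalCommands = [['env', 'add'], ['env', 'rm']]
// which are then substituted into the bash and zsh templates.
```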

View File

@ -1,32 +0,0 @@
#!/bin/bash
#TEMPLATE FILE FOR BASH COMPLETION#
_balena_complete()
{
local cur prev
# Valid top-level completions
$main_commands$
# Sub-completions
$sub_cmds$
COMPREPLY=()
cur=${COMP_WORDS[COMP_CWORD]}
prev=${COMP_WORDS[COMP_CWORD-1]}
if [ $COMP_CWORD -eq 1 ]
then
COMPREPLY=( $(compgen -W "${main_commands}" -- $cur) )
elif [ $COMP_CWORD -eq 2 ]
then
case "$prev" in
$sub_cmds_prev$
"*")
;;
esac
fi
}
complete -F _balena_complete balena

View File

@ -1,35 +0,0 @@
#compdef balena
#autoload
#TEMPLATE FILE FOR ZSH COMPLETION#
_balena() {
typeset -A opt_args
local context state line curcontext="$curcontext"
# Valid top-level completions
$main_commands$
# Sub-completions
$sub_cmds$
_arguments -C \
'(- 1 *)--version[show version and exit]' \
'(- 1 *)'{-h,--help}'[show help options and exit]' \
'1:first command:_balena_main_cmds' \
'2:second command:_balena_sec_cmds' \
&& ret=0
}
(( $+functions[_balena_main_cmds] )) ||
_balena_main_cmds() {
_describe -t main_commands 'command' main_commands "$@" && ret=0
}
(( $+functions[_balena_sec_cmds] )) ||
_balena_sec_cmds() {
case $line[1] in
$sub_cmds_prev$
esac
}
_balena "$@"

112
doc/automated-init.md Normal file
View File

@ -0,0 +1,112 @@
# Provisioning Resin.io devices in automated (non-interactive) mode
This document describes how to run the `device init` command in non-interactive mode.
It requires collecting some preliminary information _once_.
The final command to provision the device looks like this:
```bash
resin device init --app APP_ID --os-version OS_VERSION --drive DRIVE --config CONFIG_FILE --yes
```
You can run this command as many times as you need, putting the new medium (SD card / USB stick) each time.
But before you can run it you need to collect the parameters and build the configuration file. Keep reading to figure out how to do it.
## Collect all the required parameters.
1. `DEVICE_TYPE`. Run
```bash
resin devices supported
```
and find the _slug_ for your target device type, like _raspberrypi3_.
1. `APP_ID`. Create an application (`resin app create APP_NAME --type DEVICE_TYPE`) or find an existing one (`resin apps`) and note its ID.
1. `OS_VERSION`. Run
```bash
resin os versions DEVICE_TYPE
```
and pick the version that you need, like _v2.0.6+rev1.prod_.
_Note_ that even though we support _semver ranges_, it's recommended to use the exact version when doing automated provisioning, as it
guarantees full compatibility between the steps.
1. `DRIVE`. Plug in your target medium (SD card or the USB stick, depending on your device type) and run
```bash
resin util available-drives
```
and get the drive name, like _/dev/sdb_ or _/dev/mmcblk0_.
The resin CLI will not display the system drives to protect you,
but still please check very carefully that you've picked the correct drive as it will be erased during the provisioning process.
Now we have all the parameters -- time to build the config file.
## Build the config file
The interactive device provisioning process often includes collecting some extra device configuration, like the networking mode and WiFi credentials.
To skip this interactive step we need to build this configuration once and save it to a JSON file for later reuse.
Let's say we will place it into the `CONFIG_FILE` path, like _./resin-os/raspberrypi3-config.json_.
We also need to put the OS image somewhere, let's call this path `OS_IMAGE_PATH`, it can be something like _./resin-os/raspberrypi3-v2.0.6+rev1.prod.img_.
1. First we need to download the OS image once. That's needed for building the config, and will speed up the subsequent operations, as the downloaded OS image is placed into the local cache.
Run:
```bash
resin os download DEVICE_TYPE --output OS_IMAGE_PATH --version OS_VERSION
```
1. Now we're ready to build the config:
```bash
resin os build-config OS_IMAGE_PATH DEVICE_TYPE --output CONFIG_FILE
```
This will run you through the interactive configuration wizard and in the end save the generated config as `CONFIG_FILE`. You can then verify it's not empty:
```bash
cat CONFIG_FILE
```
## Done
Now you're ready to run the command in the beginning of this guide.
Please note again that all of these steps only need to be done once (unless you need to change something), and once all the parameters are collected the main init command can be run unchanged.
But there are still some nuances to cover, please read below.
## Nuances
### `sudo` password on *nix systems
In order to write the image to the raw device we need root permissions; this is unavoidable.
To improve security, we only run the minimal subcommand with `sudo`.
This means that with the default setup you are interrupted near the end of the device init process to enter your sudo password so that this subcommand can run.
There are several ways to eliminate this prompt and make the process fully non-interactive.
#### Option 1: make passwordless sudo.
Obviously you shouldn't do that if the machine you're working on has access to any sensitive resources or information.
But if you're using a machine dedicated to resin provisioning this can be fine, and also the simplest thing to do.
#### Option 2: `NOPASSWD` directive
You can configure the `resin` CLI command to be sudo-runnable without the password. Check [this post](https://askubuntu.com/questions/159007/how-do-i-run-specific-sudo-commands-without-a-password) for an example.
### Extra initialization config
As of June 2017 all the supported devices should not require any other interactive configuration.
But by the design of our system it is _possible_ (though it doesn't look likely to happen any time soon) that some extra initialization options may be requested for specific device types.
If that is the case please raise the issue in the resin CLI repository and the maintainers will add the necessary options to build the similar JSON config for this step.

1727
doc/cli.markdown Normal file

File diff suppressed because it is too large Load Diff

File diff suppressed because it is too large Load Diff

40
gulpfile.coffee Normal file
View File

@ -0,0 +1,40 @@
path = require('path')
gulp = require('gulp')
coffee = require('gulp-coffee')
inlinesource = require('gulp-inline-source')
mocha = require('gulp-mocha')
shell = require('gulp-shell')
packageJSON = require('./package.json')
OPTIONS =
files:
coffee: [ 'lib/**/*.coffee', 'gulpfile.coffee' ]
app: 'lib/**/*.coffee'
tests: 'tests/**/*.spec.coffee'
pages: 'lib/auth/pages/*.ejs'
directories:
build: 'build/'
gulp.task 'pages', ->
gulp.src(OPTIONS.files.pages)
.pipe(inlinesource())
.pipe(gulp.dest('build/auth/pages'))
gulp.task 'coffee', ->
gulp.src(OPTIONS.files.app)
.pipe(coffee(bare: true, header: true))
.pipe(gulp.dest(OPTIONS.directories.build))
gulp.task 'test', ->
gulp.src(OPTIONS.files.tests, read: false)
.pipe(mocha({
reporter: 'spec'
}))
gulp.task 'build', [
'coffee',
'pages'
]
gulp.task 'watch', [ 'build' ], ->
gulp.watch([ OPTIONS.files.coffee ], [ 'build' ])

36
lib/actions/api-key.ts Normal file
View File

@ -0,0 +1,36 @@
import { CommandDefinition } from 'capitano';
import { stripIndent } from 'common-tags';
export const generate: CommandDefinition<{
name: string;
}> = {
signature: 'api-key generate <name>',
description: 'Generate a new API key with the given name',
help: stripIndent`
This command generates a new API key for the current user, with the given
name. The key will be logged to the console.
This key can be used to log into the CLI using 'resin login --token <key>',
or to authenticate requests to the API with an 'Authorization: Bearer <key>' header.
Examples:
$ resin api-key generate "Jenkins Key"
`,
async action(params, _options, done) {
const resin = (await import('resin-sdk')).fromSharedOptions();
resin.models.apiKey
.create(params.name)
.then(key => {
console.log(stripIndent`
Registered api key '${params.name}':
${key}
This key will not be shown again, so please save it now.
`);
})
.finally(done);
},
};

157
lib/actions/app.coffee Normal file
View File

@ -0,0 +1,157 @@
###
Copyright 2016-2017 Resin.io
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
###
commandOptions = require('./command-options')
exports.create =
signature: 'app create <name>'
description: 'create an application'
help: '''
Use this command to create a new resin.io application.
You can specify the application device type with the `--type` option.
Otherwise, an interactive dropdown will be shown for you to select from.
You can see a list of supported device types with
$ resin devices supported
Examples:
$ resin app create MyApp
$ resin app create MyApp --type raspberry-pi
'''
options: [
{
signature: 'type'
parameter: 'type'
description: 'application device type (Check available types with `resin devices supported`)'
alias: 't'
}
]
permission: 'user'
action: (params, options, done) ->
resin = require('resin-sdk-preconfigured')
patterns = require('../utils/patterns')
# Validate that the application name is available
# before asking the device type.
# https://github.com/resin-io/resin-cli/issues/30
resin.models.application.has(params.name).then (hasApplication) ->
if hasApplication
patterns.exitWithExpectedError('You already have an application with that name!')
.then ->
return options.type or patterns.selectDeviceType()
.then (deviceType) ->
return resin.models.application.create(params.name, deviceType)
.then (application) ->
console.info("Application created: #{application.app_name} (#{application.device_type}, id #{application.id})")
.nodeify(done)
exports.list =
signature: 'apps'
description: 'list all applications'
help: '''
Use this command to list all your applications.
Notice this command only shows the most important bits of information for each app.
If you want detailed information, use resin app <name> instead.
Examples:
$ resin apps
'''
permission: 'user'
primary: true
action: (params, options, done) ->
resin = require('resin-sdk-preconfigured')
visuals = require('resin-cli-visuals')
resin.models.application.getAll().then (applications) ->
console.log visuals.table.horizontal applications, [
'id'
'app_name'
'device_type'
'online_devices'
'devices_length'
]
.nodeify(done)
exports.info =
signature: 'app <name>'
description: 'list a single application'
help: '''
Use this command to show detailed information for a single application.
Examples:
$ resin app MyApp
'''
permission: 'user'
primary: true
action: (params, options, done) ->
resin = require('resin-sdk-preconfigured')
visuals = require('resin-cli-visuals')
resin.models.application.get(params.name).then (application) ->
console.log visuals.table.vertical application, [
"$#{application.app_name}$"
'id'
'device_type'
'git_repository'
'commit'
]
.nodeify(done)
exports.restart =
signature: 'app restart <name>'
description: 'restart an application'
help: '''
Use this command to restart all devices that belong to a certain application.
Examples:
$ resin app restart MyApp
'''
permission: 'user'
action: (params, options, done) ->
resin = require('resin-sdk-preconfigured')
resin.models.application.restart(params.name).nodeify(done)
exports.remove =
signature: 'app rm <name>'
description: 'remove an application'
help: '''
Use this command to remove a resin.io application.
Notice this command asks for confirmation interactively.
You can avoid this by passing the `--yes` boolean option.
Examples:
$ resin app rm MyApp
$ resin app rm MyApp --yes
'''
options: [ commandOptions.yes ]
permission: 'user'
action: (params, options, done) ->
resin = require('resin-sdk-preconfigured')
patterns = require('../utils/patterns')
patterns.confirm(options.yes, 'Are you sure you want to delete the application?').then ->
resin.models.application.remove(params.name)
.nodeify(done)

210
lib/actions/auth.coffee Normal file

@ -0,0 +1,210 @@
###
Copyright 2016-2017 Resin.io
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
###
exports.login =
signature: 'login'
description: 'login to resin.io'
help: '''
Use this command to login to your resin.io account.
This command will prompt you to login using the following login types:
- Web authorization: opens your web browser and prompts you to authorize the CLI
from the dashboard.
- Credentials: using email/password and 2FA.
- Token: using a session token or API key (experimental) from the preferences page.
Examples:
$ resin login
$ resin login --web
$ resin login --token "..."
$ resin login --credentials
$ resin login --credentials --email johndoe@gmail.com --password secret
'''
options: [
{
signature: 'token'
description: 'session token or API key (experimental)'
parameter: 'token'
alias: 't'
}
{
signature: 'web'
description: 'web-based login'
boolean: true
alias: 'w'
}
{
signature: 'credentials'
description: 'credential-based login'
boolean: true
alias: 'c'
}
{
signature: 'email'
parameter: 'email'
description: 'email'
alias: [ 'e', 'u' ]
}
{
signature: 'password'
parameter: 'password'
description: 'password'
alias: 'p'
}
]
primary: true
action: (params, options, done) ->
_ = require('lodash')
Promise = require('bluebird')
resin = require('resin-sdk').fromSharedOptions()
auth = require('../auth')
form = require('resin-cli-form')
patterns = require('../utils/patterns')
messages = require('../utils/messages')
login = (options) ->
if options.token?
return Promise.try ->
return options.token if _.isString(options.token)
return form.ask
message: 'Session token or API key (experimental) from the preferences page'
name: 'token'
type: 'input'
.then(resin.auth.loginWithToken)
.tap ->
resin.auth.whoami()
.then (username) ->
if !username
patterns.exitWithExpectedError('Token authentication failed')
else if options.credentials
return patterns.authenticate(options)
else if options.web
console.info('Connecting to the web dashboard')
return auth.login()
return patterns.askLoginType().then (loginType) ->
if loginType is 'register'
{ runCommand } = require('../utils/helpers')
return runCommand('signup')
options[loginType] = true
return login(options)
resin.settings.get('resinUrl').then (resinUrl) ->
console.log(messages.resinAsciiArt)
console.log("\nLogging in to #{resinUrl}")
return login(options)
.then(resin.auth.whoami)
.tap (username) ->
console.info("Successfully logged in as: #{username}")
console.info """
Find out about the available commands by running:
$ resin help
#{messages.reachingOut}
"""
.nodeify(done)
exports.logout =
signature: 'logout'
description: 'logout from resin.io'
help: '''
Use this command to logout from your resin.io account.
Examples:
$ resin logout
'''
action: (params, options, done) ->
resin = require('resin-sdk').fromSharedOptions()
resin.auth.logout().nodeify(done)
exports.signup =
signature: 'signup'
description: 'signup to resin.io'
help: '''
Use this command to signup for a resin.io account.
If signup is successful, you'll be logged in to your new account automatically.
Examples:
$ resin signup
Email: johndoe@acme.com
Password: ***********
$ resin whoami
johndoe
'''
action: (params, options, done) ->
resin = require('resin-sdk').fromSharedOptions()
form = require('resin-cli-form')
validation = require('../utils/validation')
resin.settings.get('resinUrl').then (resinUrl) ->
console.log("\nRegistering to #{resinUrl}")
form.run [
message: 'Email:'
name: 'email'
type: 'input'
validate: validation.validateEmail
,
message: 'Password:'
name: 'password'
type: 'password',
validate: validation.validatePassword
]
.then(resin.auth.register)
.then(resin.auth.loginWithToken)
.nodeify(done)
exports.whoami =
signature: 'whoami'
description: 'get current username and email address'
help: '''
Use this command to find out the current logged in username and email address.
Examples:
$ resin whoami
'''
permission: 'user'
action: (params, options, done) ->
Promise = require('bluebird')
resin = require('resin-sdk').fromSharedOptions()
visuals = require('resin-cli-visuals')
Promise.props
username: resin.auth.whoami()
email: resin.auth.getEmail()
url: resin.settings.get('resinUrl')
.then (results) ->
console.log visuals.table.vertical results, [
'$account information$'
'username'
'email'
'url'
]
.nodeify(done)
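The whoami action above leans on bluebird's Promise.props, which resolves an object of promises into an object of plain values, so the three lookups run concurrently before the vertical table is printed. A minimal TypeScript sketch of that pattern (the values here are illustrative placeholders, not real SDK calls):

import * as Bluebird from 'bluebird';

// Promise.props waits for every value and resolves to an object with the
// same keys, which is what feeds the vertical table above.
Bluebird.props({
	username: Bluebird.resolve('johndoe'),
	email: Bluebird.resolve('johndoe@acme.com'),
	url: Bluebird.resolve('resin.io'),
}).then((results) => {
	console.log(results.username, results.email, results.url);
});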

145
lib/actions/build.coffee Normal file

@ -0,0 +1,145 @@
# Imported here because it's needed for the setup
# of this action
Promise = require('bluebird')
dockerUtils = require('../utils/docker')
compose = require('../utils/compose')
###
Opts must be an object with the following keys:
app: the app this build is for (optional)
arch: the architecture to build for
deviceType: the device type to build for
buildEmulated
buildOpts: arguments to forward to docker build command
###
buildProject = (docker, logger, composeOpts, opts) ->
compose.loadProject(
logger
composeOpts.projectPath
composeOpts.projectName
)
.then (project) ->
appType = opts.app?.application_type?[0]
if appType? and project.descriptors.length > 1 and not appType.supports_multicontainer
logger.logWarn(
'Target application does not support multiple containers.\n' +
'Continuing with build, but you will not be able to deploy.'
)
compose.buildProject(
docker
logger
project.path
project.name
project.composition
opts.arch
opts.deviceType
opts.buildEmulated
opts.buildOpts
composeOpts.inlineLogs
)
.then ->
logger.logSuccess('Build succeeded!')
.tapCatch (e) ->
logger.logError('Build failed')
module.exports =
signature: 'build [source]'
description: 'Build a single image or a multicontainer project locally'
primary: true
help: '''
Use this command to build an image or a complete multicontainer project
with the provided docker daemon.
You must provide either an application or a device-type/architecture
pair to use the resin Dockerfile pre-processor
(e.g. Dockerfile.template -> Dockerfile).
This command will look into the given source directory (or the current working
directory if one isn't specified) for a compose file. If one is found, this
command will build each service defined in the compose file. If a compose file
isn't found, the command will look for a Dockerfile, and if that isn't found either,
it will try to generate one.
Examples:
$ resin build
$ resin build ./source/
$ resin build --deviceType raspberrypi3 --arch armhf --emulated
$ resin build --application MyApp ./source/
$ resin build --docker '/var/run/docker.sock'
$ resin build --dockerHost my.docker.host --dockerPort 2376 --ca ca.pem --key key.pem --cert cert.pem
'''
options: dockerUtils.appendOptions compose.appendOptions [
{
signature: 'arch'
parameter: 'arch'
description: 'The architecture to build for'
alias: 'A'
},
{
signature: 'deviceType'
parameter: 'deviceType'
description: 'The type of device this build is for'
alias: 'd'
},
{
signature: 'application'
parameter: 'application'
description: 'The target resin.io application this build is for'
alias: 'a'
},
]
action: (params, options, done) ->
# compositions with many services trigger misleading warnings
require('events').defaultMaxListeners = 1000
{ exitWithExpectedError } = require('../utils/patterns')
helpers = require('../utils/helpers')
Logger = require('../utils/logger')
logger = new Logger()
logger.logDebug('Parsing input...')
Promise.try ->
# `build` accepts `[source]` as a parameter, but compose expects it
# as an option. swap them here
options.source ?= params.source
delete params.source
{ application, arch, deviceType } = options
if (not (arch? and deviceType?) and not application?) or (application? and (arch? or deviceType?))
exitWithExpectedError('You must specify either an application or an arch/deviceType pair to build for')
if arch? and deviceType?
[ undefined, arch, deviceType ]
else
Promise.join(
helpers.getApplication(application)
helpers.getArchAndDeviceType(application)
(app, { arch, device_type }) ->
app.arch = arch
app.device_type = device_type
return app
)
.then (app) ->
[ app, app.arch, app.device_type ]
.then ([ app, arch, deviceType ]) ->
Promise.join(
dockerUtils.getDocker(options)
dockerUtils.generateBuildOpts(options)
compose.generateOpts(options)
(docker, buildOpts, composeOpts) ->
buildProject(docker, logger, composeOpts, {
app
arch
deviceType
buildEmulated: !!options.emulated
buildOpts
})
)
.asCallback(done)
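The input check above enforces that a caller supplies either --application or a complete --arch/--deviceType pair, never both and never a partial pair. A standalone TypeScript sketch of the same rule (hypothetical helper, for illustration only):

// Hypothetical helper mirroring the check in the `build` action above:
// exactly one of "application" or the complete "arch + deviceType" pair
// must be provided.
function validateBuildTarget(opts: {
	application?: string;
	arch?: string;
	deviceType?: string;
}): void {
	const hasPair = opts.arch != null && opts.deviceType != null;
	const hasApp = opts.application != null;
	if ((!hasPair && !hasApp) || (hasApp && (opts.arch != null || opts.deviceType != null))) {
		throw new Error(
			'You must specify either an application or an arch/deviceType pair to build for',
		);
	}
}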


@ -0,0 +1,111 @@
###
Copyright 2016-2017 Resin.io
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
###
_ = require('lodash')
exports.yes =
signature: 'yes'
description: 'confirm non interactively'
boolean: true
alias: 'y'
exports.optionalApplication =
signature: 'application'
parameter: 'application'
description: 'application name'
alias: [ 'a', 'app' ]
exports.application = _.defaults
required: 'You have to specify an application'
, exports.optionalApplication
exports.optionalDevice =
signature: 'device'
parameter: 'device'
description: 'device uuid'
alias: 'd'
exports.optionalDeviceApiKey =
signature: 'deviceApiKey'
description: 'custom device key - note that this is only supported on ResinOS 2.0.3+'
parameter: 'device-api-key'
alias: 'k'
exports.optionalOsVersion =
signature: 'version'
description: 'a resinOS version'
parameter: 'version'
exports.booleanDevice =
signature: 'device'
description: 'device'
boolean: true
alias: 'd'
exports.osVersion =
signature: 'version'
description: """
exact version number, or a valid semver range,
or 'latest' (includes pre-releases),
or 'default' (excludes pre-releases if at least one stable version is available),
or 'recommended' (excludes pre-releases, will fail if only pre-release versions are available),
or 'menu' (will show the interactive menu)
"""
parameter: 'version'
exports.network =
signature: 'network'
parameter: 'network'
description: 'network type'
alias: 'n'
exports.wifiSsid =
signature: 'ssid'
parameter: 'ssid'
description: 'wifi ssid, if network is wifi'
alias: 's'
exports.wifiKey =
signature: 'key'
parameter: 'key'
description: 'wifi key, if network is wifi'
alias: 'k'
exports.forceUpdateLock =
signature: 'force'
description: 'force action if the update lock is set'
boolean: true
alias: 'f'
exports.drive =
signature: 'drive'
description: 'the drive to write the image to, like `/dev/sdb` or `/dev/mmcblk0`.
Careful with this as you can erase your hard drive.
Check `resin util available-drives` for available options.'
parameter: 'drive'
alias: 'd'
exports.advancedConfig =
signature: 'advanced'
description: 'show advanced configuration options'
boolean: true
alias: 'v'
exports.hostOSAccess =
signature: 'host'
boolean: true
description: 'access host OS (for devices with Resin OS >= 2.7.5)'
alias: 's'
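These descriptors are plain objects that individual actions reuse; for instance, exports.application above is just exports.optionalApplication with a required message merged in via lodash. A TypeScript sketch of that composition (fields copied from the definitions above):

import * as _ from 'lodash';

const optionalApplication = {
	signature: 'application',
	parameter: 'application',
	description: 'application name',
	alias: ['a', 'app'],
};

// _.defaults fills in properties missing from the first argument, so the
// result keeps every field of optionalApplication and gains `required`.
const application = _.defaults(
	{ required: 'You have to specify an application' },
	optionalApplication,
);
// => { required: '...', signature: 'application', parameter: 'application', ... }
console.log(application);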

328
lib/actions/config.coffee Normal file

@ -0,0 +1,328 @@
###
Copyright 2016-2017 Resin.io
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
###
commandOptions = require('./command-options')
{ normalizeUuidProp } = require('../utils/normalization')
exports.read =
signature: 'config read'
description: 'read a device configuration'
help: '''
Use this command to read the config.json file from the mounted filesystem (e.g. SD card) of a provisioned device
Examples:
$ resin config read --type raspberry-pi
$ resin config read --type raspberry-pi --drive /dev/disk2
'''
options: [
{
signature: 'type'
description: 'device type (Check available types with `resin devices supported`)'
parameter: 'type'
alias: 't'
required: 'You have to specify a device type'
}
{
signature: 'drive'
description: 'drive'
parameter: 'drive'
alias: 'd'
}
]
permission: 'user'
root: true
action: (params, options, done) ->
Promise = require('bluebird')
config = require('resin-config-json')
visuals = require('resin-cli-visuals')
umountAsync = Promise.promisify(require('umount').umount)
prettyjson = require('prettyjson')
Promise.try ->
return options.drive or visuals.drive('Select the device drive')
.tap(umountAsync)
.then (drive) ->
return config.read(drive, options.type)
.tap (configJSON) ->
console.info(prettyjson.render(configJSON))
.nodeify(done)
exports.write =
signature: 'config write <key> <value>'
description: 'write a device configuration'
help: '''
Use this command to write the config.json file to the mounted filesystem (e.g. SD card) of a provisioned device
Examples:
$ resin config write --type raspberry-pi username johndoe
$ resin config write --type raspberry-pi --drive /dev/disk2 username johndoe
$ resin config write --type raspberry-pi files.network/settings "..."
'''
options: [
{
signature: 'type'
description: 'device type (Check available types with `resin devices supported`)'
parameter: 'type'
alias: 't'
required: 'You have to specify a device type'
}
{
signature: 'drive'
description: 'drive'
parameter: 'drive'
alias: 'd'
}
]
permission: 'user'
root: true
action: (params, options, done) ->
Promise = require('bluebird')
_ = require('lodash')
config = require('resin-config-json')
visuals = require('resin-cli-visuals')
umountAsync = Promise.promisify(require('umount').umount)
Promise.try ->
return options.drive or visuals.drive('Select the device drive')
.tap(umountAsync)
.then (drive) ->
config.read(drive, options.type).then (configJSON) ->
console.info("Setting #{params.key} to #{params.value}")
_.set(configJSON, params.key, params.value)
return configJSON
.tap ->
return umountAsync(drive)
.then (configJSON) ->
return config.write(drive, options.type, configJSON)
.tap ->
console.info('Done')
.nodeify(done)
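The <key> argument is applied with lodash's _.set, so a dotted key addresses a nested property of config.json; that is what makes the files.network/settings example above meaningful. A small TypeScript sketch (the config fragment is illustrative, not a full config.json):

import * as _ from 'lodash';

// Illustrative config.json fragment; the real file has more fields.
const configJSON: any = { username: 'olddoe', files: {} };

// _.set interprets dots as a path and creates intermediate objects as
// needed, so "files.network/settings" targets configJSON.files['network/settings'].
_.set(configJSON, 'username', 'johndoe');
_.set(configJSON, 'files.network/settings', '...');

console.log(configJSON);
// { username: 'johndoe', files: { 'network/settings': '...' } }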
exports.inject =
signature: 'config inject <file>'
description: 'inject a device configuration file'
help: '''
Use this command to inject a config.json file into the mounted filesystem
(e.g. SD card or mounted resinOS image) of a provisioned device
Examples:
$ resin config inject my/config.json --type raspberry-pi
$ resin config inject my/config.json --type raspberry-pi --drive /dev/disk2
'''
options: [
{
signature: 'type'
description: 'device type (Check available types with `resin devices supported`)'
parameter: 'type'
alias: 't'
required: 'You have to specify a device type'
}
{
signature: 'drive'
description: 'drive'
parameter: 'drive'
alias: 'd'
}
]
permission: 'user'
root: true
action: (params, options, done) ->
Promise = require('bluebird')
config = require('resin-config-json')
visuals = require('resin-cli-visuals')
umountAsync = Promise.promisify(require('umount').umount)
readFileAsync = Promise.promisify(require('fs').readFile)
Promise.try ->
return options.drive or visuals.drive('Select the device drive')
.tap(umountAsync)
.then (drive) ->
readFileAsync(params.file, 'utf8').then(JSON.parse).then (configJSON) ->
return config.write(drive, options.type, configJSON)
.tap ->
console.info('Done')
.nodeify(done)
exports.reconfigure =
signature: 'config reconfigure'
description: 'reconfigure a provisioned device'
help: '''
Use this command to reconfigure a provisioned device
Examples:
$ resin config reconfigure --type raspberry-pi
$ resin config reconfigure --type raspberry-pi --advanced
$ resin config reconfigure --type raspberry-pi --drive /dev/disk2
'''
options: [
{
signature: 'type'
description: 'device type (Check available types with `resin devices supported`)'
parameter: 'type'
alias: 't'
required: 'You have to specify a device type'
}
{
signature: 'drive'
description: 'drive'
parameter: 'drive'
alias: 'd'
}
{
signature: 'advanced'
description: 'show advanced commands'
boolean: true
alias: 'v'
}
]
permission: 'user'
root: true
action: (params, options, done) ->
Promise = require('bluebird')
config = require('resin-config-json')
visuals = require('resin-cli-visuals')
{ runCommand } = require('../utils/helpers')
umountAsync = Promise.promisify(require('umount').umount)
Promise.try ->
return options.drive or visuals.drive('Select the device drive')
.tap(umountAsync)
.then (drive) ->
config.read(drive, options.type).get('uuid')
.tap ->
umountAsync(drive)
.then (uuid) ->
configureCommand = "os configure #{drive} --device #{uuid}"
if options.advanced
configureCommand += ' --advanced'
return runCommand(configureCommand)
.then ->
console.info('Done')
.nodeify(done)
exports.generate =
signature: 'config generate'
description: 'generate a config.json file'
help: '''
Use this command to generate a config.json for a device or application.
Calling this command without --version is not recommended, and may fail in
future releases if the OS version cannot be inferred.
This command is interactive by default, but it can be run non-interactively by specifying
an option on the command line for each question that would be asked for the relevant
device type.
Examples:
$ resin config generate --device 7cf02a6 --version 2.12.7
$ resin config generate --device 7cf02a6 --version 2.12.7 --generate-device-api-key
$ resin config generate --device 7cf02a6 --version 2.12.7 --device-api-key <existingDeviceKey>
$ resin config generate --device 7cf02a6 --version 2.12.7 --output config.json
$ resin config generate --app MyApp --version 2.12.7
$ resin config generate --app MyApp --version 2.12.7 --output config.json
$ resin config generate --app MyApp --version 2.12.7 \
--network wifi --wifiSsid mySsid --wifiKey abcdefgh --appUpdatePollInterval 1
'''
options: [
commandOptions.optionalOsVersion
commandOptions.optionalApplication
commandOptions.optionalDevice
commandOptions.optionalDeviceApiKey
{
signature: 'generate-device-api-key'
description: 'generate a fresh device key for the device'
boolean: true
}
{
signature: 'output'
description: 'output'
parameter: 'output'
alias: 'o'
}
# Options for non-interactive configuration
{
signature: 'network'
description: 'the network type to use: ethernet or wifi'
parameter: 'network'
}
{
signature: 'wifiSsid'
description: 'the wifi ssid to use (used only if --network is set to wifi)'
parameter: 'wifiSsid'
}
{
signature: 'wifiKey'
description: 'the wifi key to use (used only if --network is set to wifi)'
parameter: 'wifiKey'
}
{
signature: 'appUpdatePollInterval'
description: 'how frequently (in minutes) to poll for application updates'
parameter: 'appUpdatePollInterval'
}
]
permission: 'user'
action: (params, options, done) ->
normalizeUuidProp(options, 'device')
Promise = require('bluebird')
writeFileAsync = Promise.promisify(require('fs').writeFile)
resin = require('resin-sdk').fromSharedOptions()
form = require('resin-cli-form')
deviceConfig = require('resin-device-config')
prettyjson = require('prettyjson')
{ generateDeviceConfig, generateApplicationConfig } = require('../utils/config')
{ exitWithExpectedError } = require('../utils/patterns')
if not options.device? and not options.application?
exitWithExpectedError '''
You have to pass either a device or an application.
See the help page for examples:
$ resin help config generate
'''
Promise.try ->
if options.device?
return resin.models.device.get(options.device)
return resin.models.application.get(options.application)
.then (resource) ->
resin.models.device.getManifestBySlug(resource.device_type)
.get('options')
.then (formOptions) ->
# Pass params as an override: if there is any param with exactly the same name as a
# required option, that value is used (and the corresponding question is not asked)
form.run(formOptions, override: options)
.then (answers) ->
answers.version = options.version
if resource.uuid?
generateDeviceConfig(resource, options.deviceApiKey || options['generate-device-api-key'], answers)
else
generateApplicationConfig(resource, answers)
.then (config) ->
deviceConfig.validate(config)
if options.output?
return writeFileAsync(options.output, JSON.stringify(config))
console.log(prettyjson.render(config))
.nodeify(done)
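Non-interactive generation works because form.run above is given the command-line options as an override: any question whose name matches a provided option is answered from it and never prompted. A hedged TypeScript sketch of that call shape (the question list is illustrative; the real one comes from the device type manifest):

import * as form from 'resin-cli-form';

// Illustrative questions; the real list is read from the device type manifest.
const questions = [
	{ message: 'Wifi SSID', name: 'wifiSsid', type: 'input' },
	{ message: 'Wifi Key', name: 'wifiKey', type: 'input' },
];

// Because both question names appear in `override`, neither prompt is shown
// and the provided values come back as the answers.
form.run(questions, { override: { wifiSsid: 'mySsid', wifiKey: 'abcdefgh' } })
	.then((answers) => console.log(answers));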

220
lib/actions/deploy.coffee Normal file

@ -0,0 +1,220 @@
# Imported here because it's needed for the setup
# of this action
Promise = require('bluebird')
dockerUtils = require('../utils/docker')
compose = require('../utils/compose')
###
Opts must be an object with the following keys:
app: the application instance to deploy to
image: the image to deploy; optional
shouldPerformBuild
shouldUploadLogs
buildEmulated
buildOpts: arguments to forward to docker build command
###
deployProject = (docker, logger, composeOpts, opts) ->
_ = require('lodash')
doodles = require('resin-doodles')
sdk = require('resin-sdk').fromSharedOptions()
compose.loadProject(
logger
composeOpts.projectPath
composeOpts.projectName
opts.image
)
.then (project) ->
if project.descriptors.length > 1 and !opts.app.application_type?[0]?.supports_multicontainer
throw new Error('Target application does not support multiple containers. Aborting!')
# find which services use images that already exist locally
Promise.map project.descriptors, (d) ->
# unconditionally build (or pull) if explicitly requested
return d if opts.shouldPerformBuild
docker.getImage(d.image.tag ? d.image).inspect()
.return(d.serviceName)
.catchReturn()
.filter (d) -> !!d
.then (servicesToSkip) ->
# multibuild takes in a composition and always attempts to
# build or pull all services. we workaround that here by
# passing a modified composition.
compositionToBuild = _.cloneDeep(project.composition)
compositionToBuild.services = _.omit(compositionToBuild.services, servicesToSkip)
if _.size(compositionToBuild.services) is 0
logger.logInfo('Everything is up to date (use --build to force a rebuild)')
return {}
compose.buildProject(
docker
logger
project.path
project.name
compositionToBuild
opts.app.arch
opts.app.device_type
opts.buildEmulated
opts.buildOpts
composeOpts.inlineLogs
)
.then (builtImages) ->
_.keyBy(builtImages, 'serviceName')
.then (builtImages) ->
project.descriptors.map (d) ->
builtImages[d.serviceName] ? {
serviceName: d.serviceName,
name: d.image.tag ? d.image
logs: 'Build skipped; image for service already exists.'
props: {}
}
.then (images) ->
if opts.app.application_type?[0]?.is_legacy
chalk = require('chalk')
legacyDeploy = require('../utils/deploy-legacy')
msg = chalk.yellow('Target application requires legacy deploy method.')
logger.logWarn(msg)
return Promise.join(
docker
logger
sdk.auth.getToken()
sdk.auth.whoami()
sdk.settings.get('resinUrl')
{
appName: opts.app.app_name
imageName: images[0].name
buildLogs: images[0].logs
shouldUploadLogs: opts.shouldUploadLogs
}
legacyDeploy
)
.then (releaseId) ->
sdk.models.release.get(releaseId, $select: [ 'commit' ])
Promise.join(
sdk.auth.getUserId()
sdk.auth.getToken()
sdk.settings.get('apiUrl')
(userId, auth, apiEndpoint) ->
compose.deployProject(
docker
logger
project.composition
images
opts.app.id
userId
"Bearer #{auth}"
apiEndpoint
!opts.shouldUploadLogs
)
)
.then (release) ->
logger.logSuccess('Deploy succeeded!')
logger.logSuccess("Release: #{release.commit}")
console.log()
console.log(doodles.getDoodle()) # Show charlie
console.log()
.tapCatch (e) ->
logger.logError('Deploy failed')
module.exports =
signature: 'deploy <appName> [image]'
description: 'Deploy a single image or a multicontainer project to a resin.io application'
help: '''
Use this command to deploy an image or a complete multicontainer project
to an application, optionally building it first.
Usage: `deploy <appName> ([image] | --build [--source build-dir])`
Unless an image is specified, this command will look into the current directory
(or the one specified by --source) for a compose file. If one is found, this
command will deploy each service defined in the compose file, building it first
if an image for it doesn't exist. If a compose file isn't found, the command
will look for a Dockerfile, and if that isn't found either, it will try to
generate one.
To deploy to an app on which you're a collaborator, use
`resin deploy <appOwnerUsername>/<appName>`.
Note: If building with this command, all options supported by `resin build`
are also supported with this command.
Examples:
$ resin deploy myApp
$ resin deploy myApp --build --source myBuildDir/
$ resin deploy myApp myApp/myImage
'''
permission: 'user'
primary: true
options: dockerUtils.appendOptions compose.appendOptions [
{
signature: 'source'
parameter: 'source'
description: 'Specify an alternate source directory; default is the working directory'
alias: 's'
},
{
signature: 'build'
boolean: true
description: 'Force a rebuild before deploy'
alias: 'b'
},
{
signature: 'nologupload'
description: "Don't upload build logs to the dashboard with image (if building)"
boolean: true
}
]
action: (params, options, done) ->
# compositions with many services trigger misleading warnings
require('events').defaultMaxListeners = 1000
helpers = require('../utils/helpers')
Logger = require('../utils/logger')
logger = new Logger()
logger.logDebug('Parsing input...')
Promise.try ->
{ appName, image } = params
# look into "resin build" options if appName isn't given
appName = options.application if not appName?
delete options.application
if not appName?
throw new Error('Please specify the name of the application to deploy')
if image? and options.build
throw new Error('Build option is not applicable when specifying an image')
Promise.join(
helpers.getApplication(appName)
helpers.getArchAndDeviceType(appName)
(app, { arch, device_type }) ->
app.arch = arch
app.device_type = device_type
return app
)
.then (app) ->
[ app, image, !!options.build, !options.nologupload ]
.then ([ app, image, shouldPerformBuild, shouldUploadLogs ]) ->
Promise.join(
dockerUtils.getDocker(options)
dockerUtils.generateBuildOpts(options)
compose.generateOpts(options)
(docker, buildOpts, composeOpts) ->
deployProject(docker, logger, composeOpts, {
app
image
shouldPerformBuild
shouldUploadLogs
buildEmulated: !!options.emulated
buildOpts
})
)
.asCallback(done)
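The "everything is up to date" path above hinges on docker.getImage(...).inspect() rejecting when a tag is missing locally; services whose inspect succeeds are dropped from the composition handed to the builder. A minimal TypeScript sketch of that check in isolation (assuming a dockerode client; the descriptor list is illustrative):

import * as Docker from 'dockerode';

const docker = new Docker();

// Resolves to the names of services whose image tags already exist locally
// and can therefore be skipped when (re)building the composition.
async function servicesWithLocalImages(
	descriptors: Array<{ serviceName: string; image: string }>,
): Promise<string[]> {
	const checked = await Promise.all(
		descriptors.map((d) =>
			docker
				.getImage(d.image)
				.inspect()
				.then(() => d.serviceName)
				// A rejection means the image is not present locally, so the
				// service stays in the build.
				.catch(() => undefined),
		),
	);
	return checked.filter((name): name is string => name != null);
}

// Example (illustrative service list):
// servicesWithLocalImages([{ serviceName: 'main', image: 'myapp_main:latest' }]).then(console.log);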

457
lib/actions/device.coffee Normal file

@ -0,0 +1,457 @@
###
Copyright 2016-2017 Resin.io
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
###
commandOptions = require('./command-options')
_ = require('lodash')
{ normalizeUuidProp } = require('../utils/normalization')
expandForAppName = {
$expand: belongs_to__application: $select: 'app_name'
}
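The $expand above asks the API to inline each device's parent application, selecting only its app_name; the actions below then read it back as device.belongs_to__application[0].app_name. A TypeScript sketch of the relevant slice of such a record (field values are illustrative):

// Only the fields these actions rely on; real device records carry many more.
interface DeviceWithAppName {
	uuid: string;
	device_name: string;
	belongs_to__application: Array<{ app_name: string }>;
}

const example: DeviceWithAppName = {
	uuid: '7cf02a687b74206f92cb455969cf8e98',
	device_name: 'my-device',
	belongs_to__application: [{ app_name: 'MyApp' }],
};

// This is the lookup performed in the list/info actions below.
const applicationName = example.belongs_to__application[0].app_name; // 'MyApp'
console.log(applicationName);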
exports.list =
signature: 'devices'
description: 'list all devices'
help: '''
Use this command to list all devices that belong to you.
You can filter the devices by application by using the `--application` option.
Examples:
$ resin devices
$ resin devices --application MyApp
$ resin devices --app MyApp
$ resin devices -a MyApp
'''
options: [ commandOptions.optionalApplication ]
permission: 'user'
primary: true
action: (params, options, done) ->
Promise = require('bluebird')
resin = require('resin-sdk').fromSharedOptions()
visuals = require('resin-cli-visuals')
Promise.try ->
if options.application?
return resin.models.device.getAllByApplication(options.application, expandForAppName)
return resin.models.device.getAll(expandForAppName)
.tap (devices) ->
devices = _.map devices, (device) ->
device.dashboard_url = resin.models.device.getDashboardUrl(device.uuid)
device.application_name = device.belongs_to__application[0].app_name
device.uuid = device.uuid.slice(0, 7)
return device
console.log visuals.table.horizontal devices, [
'id'
'uuid'
'device_name'
'device_type'
'application_name'
'status'
'is_online'
'supervisor_version'
'os_version'
'dashboard_url'
]
.nodeify(done)
exports.info =
signature: 'device <uuid>'
description: 'list a single device'
help: '''
Use this command to show information about a single device.
Examples:
$ resin device 7cf02a6
'''
permission: 'user'
primary: true
action: (params, options, done) ->
normalizeUuidProp(params)
resin = require('resin-sdk').fromSharedOptions()
visuals = require('resin-cli-visuals')
resin.models.device.get(params.uuid, expandForAppName)
.then (device) ->
resin.models.device.getStatus(device).then (status) ->
device.status = status
device.dashboard_url = resin.models.device.getDashboardUrl(device.uuid)
device.application_name = device.belongs_to__application[0].app_name
device.commit = device.is_on__commit
console.log visuals.table.vertical device, [
"$#{device.device_name}$"
'id'
'device_type'
'status'
'is_online'
'ip_address'
'application_name'
'last_seen'
'uuid'
'commit'
'supervisor_version'
'is_web_accessible'
'note'
'os_version'
'dashboard_url'
]
.nodeify(done)
exports.supported =
signature: 'devices supported'
description: 'list all supported devices'
help: '''
Use this command to get the list of all supported devices
Examples:
$ resin devices supported
'''
action: (params, options, done) ->
resin = require('resin-sdk').fromSharedOptions()
visuals = require('resin-cli-visuals')
resin.models.config.getDeviceTypes().then (deviceTypes) ->
console.log visuals.table.horizontal deviceTypes, [
'slug'
'name'
]
.nodeify(done)
exports.register =
signature: 'device register <application>'
description: 'register a device'
help: '''
Use this command to register a device to an application.
Examples:
$ resin device register MyApp
$ resin device register MyApp --uuid <uuid>
'''
permission: 'user'
options: [
{
signature: 'uuid'
description: 'custom uuid'
parameter: 'uuid'
alias: 'u'
}
]
action: (params, options, done) ->
Promise = require('bluebird')
resin = require('resin-sdk').fromSharedOptions()
Promise.join(
resin.models.application.get(params.application)
options.uuid ? resin.models.device.generateUniqueKey()
(application, uuid) ->
console.info("Registering to #{application.app_name}: #{uuid}")
return resin.models.device.register(application.id, uuid)
)
.get('uuid')
.nodeify(done)
exports.remove =
signature: 'device rm <uuid>'
description: 'remove a device'
help: '''
Use this command to remove a device from resin.io.
Notice this command asks for confirmation interactively.
You can avoid this by passing the `--yes` boolean option.
Examples:
$ resin device rm 7cf02a6
$ resin device rm 7cf02a6 --yes
'''
options: [ commandOptions.yes ]
permission: 'user'
action: (params, options, done) ->
normalizeUuidProp(params)
resin = require('resin-sdk').fromSharedOptions()
patterns = require('../utils/patterns')
patterns.confirm(options.yes, 'Are you sure you want to delete the device?').then ->
resin.models.device.remove(params.uuid)
.nodeify(done)
exports.identify =
signature: 'device identify <uuid>'
description: 'identify a device with a UUID'
help: '''
Use this command to identify a device.
On the Raspberry Pi, the ACT LED will blink several times.
Examples:
$ resin device identify 23c73a1
'''
permission: 'user'
action: (params, options, done) ->
normalizeUuidProp(params)
resin = require('resin-sdk').fromSharedOptions()
resin.models.device.identify(params.uuid).nodeify(done)
exports.reboot =
signature: 'device reboot <uuid>'
description: 'restart a device'
help: '''
Use this command to remotely reboot a device
Examples:
$ resin device reboot 23c73a1
'''
options: [ commandOptions.forceUpdateLock ]
permission: 'user'
action: (params, options, done) ->
normalizeUuidProp(params)
resin = require('resin-sdk').fromSharedOptions()
resin.models.device.reboot(params.uuid, options).nodeify(done)
exports.shutdown =
signature: 'device shutdown <uuid>'
description: 'shutdown a device'
help: '''
Use this command to remotely shutdown a device
Examples:
$ resin device shutdown 23c73a1
'''
options: [ commandOptions.forceUpdateLock ]
permission: 'user'
action: (params, options, done) ->
normalizeUuidProp(params)
resin = require('resin-sdk').fromSharedOptions()
resin.models.device.shutdown(params.uuid, options).nodeify(done)
exports.enableDeviceUrl =
signature: 'device public-url enable <uuid>'
description: 'enable public URL for a device'
help: '''
Use this command to enable public URL for a device
Examples:
$ resin device public-url enable 23c73a1
'''
permission: 'user'
action: (params, options, done) ->
normalizeUuidProp(params)
resin = require('resin-sdk').fromSharedOptions()
resin.models.device.enableDeviceUrl(params.uuid).nodeify(done)
exports.disableDeviceUrl =
signature: 'device public-url disable <uuid>'
description: 'disable public URL for a device'
help: '''
Use this command to disable public URL for a device
Examples:
$ resin device public-url disable 23c73a1
'''
permission: 'user'
action: (params, options, done) ->
normalizeUuidProp(params)
resin = require('resin-sdk').fromSharedOptions()
resin.models.device.disableDeviceUrl(params.uuid).nodeify(done)
exports.getDeviceUrl =
signature: 'device public-url <uuid>'
description: 'gets the public URL of a device'
help: '''
Use this command to get the public URL of a device
Examples:
$ resin device public-url 23c73a1
'''
permission: 'user'
action: (params, options, done) ->
normalizeUuidProp(params)
resin = require('resin-sdk').fromSharedOptions()
resin.models.device.getDeviceUrl(params.uuid).then (url) ->
console.log(url)
.nodeify(done)
exports.hasDeviceUrl =
signature: 'device public-url status <uuid>'
description: 'Returns true if public URL is enabled for a device'
help: '''
Use this command to determine if public URL is enabled for a device
Examples:
$ resin device public-url status 23c73a1
'''
permission: 'user'
action: (params, options, done) ->
normalizeUuidProp(params)
resin = require('resin-sdk').fromSharedOptions()
resin.models.device.hasDeviceUrl(params.uuid).then (hasDeviceUrl) ->
console.log(hasDeviceUrl)
.nodeify(done)
exports.rename =
signature: 'device rename <uuid> [newName]'
description: 'rename a resin device'
help: '''
Use this command to rename a device.
If you omit the name, you'll get asked for it interactively.
Examples:
$ resin device rename 7cf02a6
$ resin device rename 7cf02a6 MyPi
'''
permission: 'user'
action: (params, options, done) ->
normalizeUuidProp(params)
Promise = require('bluebird')
resin = require('resin-sdk').fromSharedOptions()
form = require('resin-cli-form')
Promise.try ->
return params.newName if not _.isEmpty(params.newName)
form.ask
message: 'How do you want to name this device?'
type: 'input'
.then(_.partial(resin.models.device.rename, params.uuid))
.nodeify(done)
exports.move =
signature: 'device move <uuid>'
description: 'move a device to another application'
help: '''
Use this command to move a device to another application you own.
If you omit the application, you'll get asked for it interactively.
Examples:
$ resin device move 7cf02a6
$ resin device move 7cf02a6 --application MyNewApp
'''
permission: 'user'
options: [ commandOptions.optionalApplication ]
action: (params, options, done) ->
normalizeUuidProp(params)
resin = require('resin-sdk').fromSharedOptions()
patterns = require('../utils/patterns')
resin.models.device.get(params.uuid, expandForAppName).then (device) ->
return options.application or patterns.selectApplication (application) ->
return _.every [
application.device_type is device.device_type
device.belongs_to__application[0].app_name isnt application.app_name
]
.tap (application) ->
return resin.models.device.move(params.uuid, application)
.then (application) ->
console.info("#{params.uuid} was moved to #{application}")
.nodeify(done)
exports.init =
signature: 'device init'
description: 'initialise a device with resinOS'
help: '''
Use this command to download the OS image of a certain application and write it to an SD Card.
Notice this command may ask for confirmation interactively.
You can avoid this by passing the `--yes` boolean option.
Examples:
$ resin device init
$ resin device init --application MyApp
'''
options: [
commandOptions.optionalApplication
commandOptions.yes
commandOptions.advancedConfig
_.assign({}, commandOptions.osVersion, { signature: 'os-version', parameter: 'os-version' })
commandOptions.drive
{
signature: 'config'
description: 'path to the config JSON file, see `resin os build-config`'
parameter: 'config'
}
]
permission: 'user'
action: (params, options, done) ->
Promise = require('bluebird')
rimraf = Promise.promisify(require('rimraf'))
tmp = require('tmp')
tmpNameAsync = Promise.promisify(tmp.tmpName)
tmp.setGracefulCleanup()
resin = require('resin-sdk').fromSharedOptions()
patterns = require('../utils/patterns')
{ runCommand } = require('../utils/helpers')
Promise.try ->
return options.application if options.application?
return patterns.selectApplication()
.then(resin.models.application.get)
.then (application) ->
download = ->
tmpNameAsync().then (tempPath) ->
osVersion = options['os-version'] or 'default'
runCommand("os download #{application.device_type} --output '#{tempPath}' --version #{osVersion}")
.disposer (tempPath) ->
return rimraf(tempPath)
Promise.using download(), (tempPath) ->
runCommand("device register #{application.app_name}")
.then(resin.models.device.get)
.tap (device) ->
configureCommand = "os configure '#{tempPath}' --device #{device.uuid}"
if options.config
configureCommand += " --config '#{options.config}'"
else if options.advanced
configureCommand += ' --advanced'
runCommand(configureCommand)
.then ->
osInitCommand = "os initialize '#{tempPath}' --type #{application.device_type}"
if options.yes
osInitCommand += ' --yes'
if options.drive
osInitCommand += " --drive #{options.drive}"
runCommand(osInitCommand)
# Make sure the device resource is removed if there is an
# error when configuring or initializing a device image
.catch (error) ->
resin.models.device.remove(device.uuid).finally ->
throw error
.then (device) ->
console.log('Done')
return device.uuid
.nodeify(done)
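The download step above attaches a bluebird disposer to the temporary image path and wraps the rest of the flow in Promise.using, so the temp file is removed whether configuration succeeds or throws. A stripped-down TypeScript sketch of the disposer pattern (the acquire/release pair is hypothetical, standing in for the OS download and cleanup):

import * as Bluebird from 'bluebird';

// Hypothetical acquire/release pair standing in for "download OS image" /
// "delete temporary image" in the action above.
const acquireTempPath = (): Bluebird<string> =>
	Bluebird.resolve('/tmp/resin-image-placeholder.img');
const removeTempPath = (tempPath: string): Bluebird<void> => {
	console.log(`cleaning up ${tempPath}`);
	return Bluebird.resolve();
};

// The disposer is attached to the acquisition promise; Bluebird.using
// guarantees it runs after the block settles, resolved or rejected.
const tempImage = () => acquireTempPath().disposer(removeTempPath);

Bluebird.using(tempImage(), (tempPath) => {
	console.log(`configuring and flashing ${tempPath}`);
	return Bluebird.resolve();
}).then(() => console.log('Done'));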


@ -0,0 +1,197 @@
###
Copyright 2016-2017 Resin.io
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
###
commandOptions = require('./command-options')
{ normalizeUuidProp } = require('../utils/normalization')
exports.list =
signature: 'envs'
description: 'list all environment variables'
help: '''
Use this command to list all environment variables for
a particular application or device.
This command lists all custom environment variables.
If you want to see all environment variables, including private
ones used by resin, use the verbose option.
At the moment the CLI doesn't fully support multi-container applications,
so the following commands will only show service variables,
without showing which service they belong to.
Examples:
$ resin envs --application MyApp
$ resin envs --application MyApp --verbose
$ resin envs --device 7cf02a6
'''
options: [
commandOptions.optionalApplication
commandOptions.optionalDevice
{
signature: 'verbose'
description: 'show private environment variables'
boolean: true
alias: 'v'
}
]
permission: 'user'
action: (params, options, done) ->
normalizeUuidProp(options, 'device')
Promise = require('bluebird')
_ = require('lodash')
resin = require('resin-sdk-preconfigured')
visuals = require('resin-cli-visuals')
{ exitWithExpectedError } = require('../utils/patterns')
Promise.try ->
if options.application?
return resin.models.environmentVariables.getAllByApplication(options.application)
else if options.device?
return resin.models.environmentVariables.device.getAll(options.device)
else
exitWithExpectedError('You must specify an application or device')
.tap (environmentVariables) ->
if _.isEmpty(environmentVariables)
exitWithExpectedError('No environment variables found')
if not options.verbose
isSystemVariable = resin.models.environmentVariables.isSystemVariable
environmentVariables = _.reject(environmentVariables, isSystemVariable)
console.log visuals.table.horizontal environmentVariables, [
'id'
'name'
'value'
]
.nodeify(done)
exports.remove =
signature: 'env rm <id>'
description: 'remove an environment variable'
help: '''
Use this command to remove an environment variable from an application.
Don't remove resin-specific variables, as things might not work as expected.
Notice this command asks for confirmation interactively.
You can avoid this by passing the `--yes` boolean option.
If you want to eliminate a device environment variable, pass the `--device` boolean option.
Examples:
$ resin env rm 215
$ resin env rm 215 --yes
$ resin env rm 215 --device
'''
options: [
commandOptions.yes
commandOptions.booleanDevice
]
permission: 'user'
action: (params, options, done) ->
resin = require('resin-sdk-preconfigured')
patterns = require('../utils/patterns')
patterns.confirm(options.yes, 'Are you sure you want to delete the environment variable?').then ->
if options.device
resin.models.environmentVariables.device.remove(params.id)
else
resin.models.environmentVariables.remove(params.id)
.nodeify(done)
exports.add =
signature: 'env add <key> [value]'
description: 'add an environment variable'
help: '''
Use this command to add an environment variable to an application.
At the moment the CLI doesn't fully support multi-container applications,
so the following commands will only set service variables for the first
service in your application.
If value is omitted, the tool will attempt to use the variable's value
as defined in your host machine.
Use the `--device` option if you want to assign the environment variable
to a specific device.
If the value is grabbed from the environment, a warning message will be printed.
Use `--quiet` to remove it.
Examples:
$ resin env add EDITOR vim --application MyApp
$ resin env add TERM --application MyApp
$ resin env add EDITOR vim --device 7cf02a6
'''
options: [
commandOptions.optionalApplication
commandOptions.optionalDevice
]
permission: 'user'
action: (params, options, done) ->
normalizeUuidProp(options, 'device')
Promise = require('bluebird')
resin = require('resin-sdk-preconfigured')
{ exitWithExpectedError } = require('../utils/patterns')
Promise.try ->
if not params.value?
params.value = process.env[params.key]
if not params.value?
throw new Error("Environment value not found for key: #{params.key}")
else
console.info("Warning: using #{params.key}=#{params.value} from host environment")
if options.application?
resin.models.environmentVariables.create(options.application, params.key, params.value)
else if options.device?
resin.models.environmentVariables.device.create(options.device, params.key, params.value)
else
exitWithExpectedError('You must specify an application or device')
.nodeify(done)
exports.rename =
signature: 'env rename <id> <value>'
description: 'rename an environment variable'
help: '''
Use this command to rename an environment variable of an application.
Pass the `--device` boolean option if you want to rename a device environment variable.
Examples:
$ resin env rename 376 emacs
$ resin env rename 376 emacs --device
'''
permission: 'user'
options: [ commandOptions.booleanDevice ]
action: (params, options, done) ->
Promise = require('bluebird')
resin = require('resin-sdk-preconfigured')
Promise.try ->
if options.device
resin.models.environmentVariables.device.update(params.id, params.value)
else
resin.models.environmentVariables.update(params.id, params.value)
.nodeify(done)

119
lib/actions/help.coffee Normal file

@ -0,0 +1,119 @@
###
Copyright 2016-2017 Resin.io
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
###
_ = require('lodash')
capitano = require('capitano')
columnify = require('columnify')
messages = require('../utils/messages')
{ exitWithExpectedError } = require('../utils/patterns')
parse = (object) ->
return _.fromPairs _.map object, (item) ->
# Hacky way to determine if an object is
# a function or a command
if item.alias?
signature = item.toString()
else
signature = item.signature.toString()
return [
signature
item.description
]
indent = (text) ->
text = _.map text.split('\n'), (line) ->
return ' ' + line
return text.join('\n')
print = (data) ->
console.log indent columnify data,
showHeaders: false
minWidth: 35
general = (params, options, done) ->
console.log('Usage: resin [COMMAND] [OPTIONS]\n')
console.log(messages.reachingOut)
console.log('\nPrimary commands:\n')
# We do not want the wildcard command
# to be printed in the help screen.
commands = _.reject capitano.state.commands, (command) ->
return command.hidden or command.isWildcard()
groupedCommands = _.groupBy commands, (command) ->
if command.primary
return 'primary'
return 'secondary'
print(parse(groupedCommands.primary))
if options.verbose
console.log('\nAdditional commands:\n')
print(parse(groupedCommands.secondary))
else
console.log('\nRun `resin help --verbose` to list additional commands')
if not _.isEmpty(capitano.state.globalOptions)
console.log('\nGlobal Options:\n')
print(parse(capitano.state.globalOptions))
return done()
command = (params, options, done) ->
capitano.state.getMatchCommand params.command, (error, command) ->
return done(error) if error?
if not command? or command.isWildcard()
exitWithExpectedError("Command not found: #{params.command}")
console.log("Usage: #{command.signature}")
if command.help?
console.log("\n#{command.help}")
else if command.description?
console.log("\n#{_.capitalize(command.description)}")
if not _.isEmpty(command.options)
console.log('\nOptions:\n')
print(parse(command.options))
return done()
exports.help =
signature: 'help [command...]'
description: 'show help'
help: '''
Get detailed help for a specific command.
Examples:
$ resin help apps
$ resin help os download
'''
primary: true
options: [
signature: 'verbose'
description: 'show additional commands'
boolean: true
alias: 'v'
]
action: (params, options, done) ->
if params.command?
command(params, options, done)
else
general(params, options, done)
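The split between "Primary commands" and the --verbose list is a plain _.groupBy over the registered commands, keyed on their primary flag. A TypeScript sketch of that grouping (the command list is illustrative; the real one comes from capitano's state):

import * as _ from 'lodash';

// Illustrative commands; capitano.state.commands supplies the real list.
const commands = [
	{ signature: 'login', description: 'login to resin.io', primary: true },
	{ signature: 'whoami', description: 'get current username and email address', primary: false },
];

const groupedCommands = _.groupBy(commands, (command) =>
	command.primary ? 'primary' : 'secondary',
);
// groupedCommands.primary   -> shown by default
// groupedCommands.secondary -> shown only with `resin help --verbose`
console.log(groupedCommands);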

42
lib/actions/index.coffee Normal file

@ -0,0 +1,42 @@
###
Copyright 2016-2017 Resin.io
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
###
module.exports =
wizard: require('./wizard')
apiKey: require('./api-key')
app: require('./app')
auth: require('./auth')
info: require('./info')
device: require('./device')
env: require('./environment-variables')
keys: require('./keys')
logs: require('./logs')
local: require('./local')
notes: require('./notes')
help: require('./help')
os: require('./os')
settings: require('./settings')
config: require('./config')
sync: require('./sync')
ssh: require('./ssh')
internal: require('./internal')
build: require('./build')
deploy: require('./deploy')
util: require('./util')
preload: require('./preload')
push: require('./push')
join: require('./join')
leave: require('./leave')

30
lib/actions/info.ts Normal file

@ -0,0 +1,30 @@
/*
Copyright 2016-2017 Resin.io
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/
import { CommandDefinition } from 'capitano';
export const version: CommandDefinition = {
signature: 'version',
description: 'output the version number',
help: `\
Display the Resin CLI version.\
`,
async action(_params, _options, done) {
const packageJSON = await import('../../package.json');
console.log(packageJSON.version);
return done();
},
};


@ -0,0 +1,87 @@
###
Copyright 2016-2017 Resin.io
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
###
# These are internal commands we want to be runnable from the outside
# One use-case for this is spawning the minimal operation with root privileges
exports.osInit =
signature: 'internal osinit <image> <type> <config>'
description: 'do actual init of the device with the preconfigured os image'
help: '''
Don't use this command directly! Use `resin os initialize <image>` instead.
'''
hidden: true
root: true
action: (params, options, done) ->
Promise = require('bluebird')
init = require('resin-device-init')
helpers = require('../utils/helpers')
return Promise.try ->
config = JSON.parse(params.config)
init.initialize(params.image, params.type, config)
.then(helpers.osProgressHandler)
.nodeify(done)
exports.scanDevices =
signature: 'internal scandevices'
description: 'scan for local resin-enabled devices and show a picker to choose one'
help: '''
Don't use this command directly!
'''
hidden: true
root: true
action: (params, options, done) ->
Promise = require('bluebird')
{ forms } = require('resin-sync')
return Promise.try ->
forms.selectLocalResinOsDevice()
.then (hostnameOrIp) ->
console.error("==> Selected device: #{hostnameOrIp}")
.nodeify(done)
exports.sudo =
signature: 'internal sudo <command>'
description: 'execute arbitrary commands in a privileged subprocess'
help: '''
Don't use this command directly!
<command> must be passed as a single argument. That means you need to make sure
you enclose <command> in quotes (e.g. resin internal sudo 'ls -alF') if for
whatever reason you invoke the command directly or, typically, pass <command>
as a single argument to spawn (e.g. `spawn('resin', [ 'internal', 'sudo', 'ls -alF' ])`).
Furthermore, this command will naively split <command> on whitespace and directly
forward the parts as arguments to `sudo`, so be careful.
'''
hidden: true
action: (params, options, done) ->
os = require('os')
Promise = require('bluebird')
return Promise.try ->
if os.platform() is 'win32'
windosu = require('windosu')
windosu.exec(params.command, {})
else
{ spawn } = require('child_process')
{ wait } = require('rindle')
cmd = params.command.split(' ')
ps = spawn('sudo', cmd, stdio: 'inherit', env: process.env)
wait(ps)
.nodeify(done)
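As the help text warns, <command> travels as one argument and is then split on whitespace before being handed to sudo. A short TypeScript sketch of invoking it safely from another process (mirroring the spawn example in the help above):

import { spawn } from 'child_process';

// 'ls -alF' is a single argv entry here; the action above splits it on
// whitespace itself before forwarding the parts to sudo.
const child = spawn('resin', ['internal', 'sudo', 'ls -alF'], {
	stdio: 'inherit',
});

child.on('exit', (code) => {
	process.exitCode = code == null ? 1 : code;
});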

62
lib/actions/join.ts Normal file

@ -0,0 +1,62 @@
/*
Copyright 2016-2017 Resin.io
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/
import * as Bluebird from 'bluebird';
import { CommandDefinition } from 'capitano';
import { stripIndent } from 'common-tags';
interface Args {
deviceIp?: string;
}
interface Options {
application?: string;
}
export const join: CommandDefinition<Args, Options> = {
signature: 'join [deviceIp]',
description:
'Promote a local device running unmanaged resinOS to join a resin.io application',
help: stripIndent`
Examples:
$ resin join
$ resin join resin.local
$ resin join resin.local --application MyApp
$ resin join 192.168.1.25
$ resin join 192.168.1.25 --application MyApp
`,
options: [
{
signature: 'application',
parameter: 'application',
alias: 'a',
description: 'The name of the application the device should join',
},
],
primary: true,
async action(params, options, done) {
const resin = await import('resin-sdk');
const Logger = await import('../utils/logger');
const promote = await import('../utils/promote');
const sdk = resin.fromSharedOptions();
const logger = new Logger();
return Bluebird.try(() => {
return promote.join(logger, sdk, params.deviceIp, options.application);
}).nodeify(done);
},
};

123
lib/actions/keys.coffee Normal file

@ -0,0 +1,123 @@
###
Copyright 2016-2017 Resin.io
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
###
commandOptions = require('./command-options')
exports.list =
signature: 'keys'
description: 'list all ssh keys'
help: '''
Use this command to list all your SSH keys.
Examples:
$ resin keys
'''
permission: 'user'
action: (params, options, done) ->
resin = require('resin-sdk').fromSharedOptions()
visuals = require('resin-cli-visuals')
resin.models.key.getAll().then (keys) ->
console.log visuals.table.horizontal keys, [
'id'
'title'
]
.nodeify(done)
exports.info =
signature: 'key <id>'
description: 'list a single ssh key'
help: '''
Use this command to show information about a single SSH key.
Examples:
$ resin key 17
'''
permission: 'user'
action: (params, options, done) ->
resin = require('resin-sdk').fromSharedOptions()
visuals = require('resin-cli-visuals')
resin.models.key.get(params.id).then (key) ->
console.log visuals.table.vertical key, [
'id'
'title'
]
# Since the public key string is long, it might
# wrap to lines below, causing the table layout to break.
# See https://github.com/resin-io/resin-cli/issues/151
console.log('\n' + key.public_key)
.nodeify(done)
exports.remove =
signature: 'key rm <id>'
description: 'remove a ssh key'
help: '''
Use this command to remove a SSH key from resin.io.
Notice this command asks for confirmation interactively.
You can avoid this by passing the `--yes` boolean option.
Examples:
$ resin key rm 17
$ resin key rm 17 --yes
'''
options: [ commandOptions.yes ]
permission: 'user'
action: (params, options, done) ->
resin = require('resin-sdk').fromSharedOptions()
patterns = require('../utils/patterns')
patterns.confirm(options.yes, 'Are you sure you want to delete the key?').then ->
resin.models.key.remove(params.id)
.nodeify(done)
exports.add =
signature: 'key add <name> [path]'
description: 'add an SSH key to resin.io'
help: '''
Use this command to associate a new SSH key with your account.
If `path` is omitted, the command will attempt
to read the SSH key from stdin.
Examples:
$ resin key add Main ~/.ssh/id_rsa.pub
$ cat ~/.ssh/id_rsa.pub | resin key add Main
'''
permission: 'user'
action: (params, options, done) ->
_ = require('lodash')
Promise = require('bluebird')
readFileAsync = Promise.promisify(require('fs').readFile)
capitano = require('capitano')
resin = require('resin-sdk').fromSharedOptions()
Promise.try ->
return readFileAsync(params.path, encoding: 'utf8') if params.path?
# TODO: should this be promisified for consistency?
Promise.fromNode (callback) ->
capitano.utils.getStdin (data) ->
return callback(null, data)
.then(_.partial(resin.models.key.create, params.name))
.nodeify(done)
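The `key add` action above reads the key either from the given path or from stdin via `capitano.utils.getStdin`; the TODO asks whether that branch should be promisified for consistency. A minimal sketch of that promisified flow, assuming only plain Node APIs (the helper name is hypothetical, not part of the CLI):
import { promises as fs } from 'fs';
// Hypothetical helper: resolve the key text from a file path when one is
// given, otherwise read the whole of stdin as a promise.
async function readPublicKey(path?: string): Promise<string> {
	if (path != null) {
		return fs.readFile(path, 'utf8');
	}
	const chunks: Buffer[] = [];
	for await (const chunk of process.stdin) {
		chunks.push(Buffer.from(chunk));
	}
	return Buffer.concat(chunks).toString('utf8');
}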

49
lib/actions/leave.ts Normal file
View File

@ -0,0 +1,49 @@
/*
Copyright 2016-2017 Resin.io
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/
import * as Bluebird from 'bluebird';
import { CommandDefinition } from 'capitano';
import { stripIndent } from 'common-tags';
interface Args {
deviceIp?: string;
}
export const leave: CommandDefinition<Args, {}> = {
signature: 'leave [deviceIp]',
description: 'Detach a local device from its resin.io application',
help: stripIndent`
Examples:
$ resin leave
$ resin leave resin.local
$ resin leave 192.168.1.25
`,
options: [],
permission: 'user',
primary: true,
async action(params, _options, done) {
const resin = await import('resin-sdk');
const Logger = await import('../utils/logger');
const promote = await import('../utils/promote');
const sdk = resin.fromSharedOptions();
const logger = new Logger();
return Bluebird.try(() => {
return promote.leave(logger, sdk, params.deviceIp);
}).nodeify(done);
},
};

View File

@ -0,0 +1,62 @@
Promise = require('bluebird')
_ = require('lodash')
form = require('resin-cli-form')
chalk = require('chalk')
dockerUtils = require('../../utils/docker')
{ exitWithExpectedError } = require('../../utils/patterns')
exports.dockerPort = dockerPort = 2375
exports.dockerTimeout = dockerTimeout = 2000
exports.filterOutSupervisorContainer = filterOutSupervisorContainer = (container) ->
for name in container.Names
return false if name.includes('resin_supervisor')
return true
exports.selectContainerFromDevice = Promise.method (deviceIp, filterSupervisor = false) ->
docker = dockerUtils.createClient(host: deviceIp, port: dockerPort, timeout: dockerTimeout)
# List all containers, including those not running
docker.listContainersAsync(all: true)
.filter (container) ->
return true if not filterSupervisor
filterOutSupervisorContainer(container)
.then (containers) ->
if _.isEmpty(containers)
exitWithExpectedError("No containers found in #{deviceIp}")
return form.ask
message: 'Select a container'
type: 'list'
choices: _.map containers, (container) ->
containerName = container.Names?[0] or 'Untitled'
shortContainerId = ('' + container.Id).substr(0, 11)
return {
name: "#{containerName} (#{shortContainerId})"
value: container.Id
}
exports.pipeContainerStream = Promise.method ({ deviceIp, name, outStream, follow = false }) ->
docker = dockerUtils.createClient(host: deviceIp, port: dockerPort)
container = docker.getContainer(name)
container.inspectAsync()
.then (containerInfo) ->
return containerInfo?.State?.Running
.then (isRunning) ->
container.attachAsync
logs: not follow or not isRunning
stream: follow and isRunning
stdout: true
stderr: true
.then (containerStream) ->
containerStream.pipe(outStream)
.catch (err) ->
err = '' + err.statusCode
if err is '404'
return console.log(chalk.red.bold("Container '#{name}' not found."))
throw err
exports.getSubShellCommand = require('../../utils/helpers').getSubShellCommand
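`pipeContainerStream` above encodes a small attach rule: stream live output only when following a running container, otherwise fall back to the stored logs. The rule in isolation, as an illustrative sketch (the function name is hypothetical):
// Illustrative only: the attach-option rule used by pipeContainerStream above.
function attachOptions(follow: boolean, isRunning: boolean) {
	return {
		logs: !follow || !isRunning, // fetch stored logs unless we can stream
		stream: follow && isRunning, // stream only when following a running container
		stdout: true,
		stderr: true,
	};
}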

View File

@ -0,0 +1,230 @@
###
Copyright 2017 Resin.io
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
###
BOOT_PARTITION = 1
CONNECTIONS_FOLDER = '/system-connections'
getConfigurationSchema = (connnectionFileName = 'resin-wifi') ->
mapper: [
{
template:
persistentLogging: '{{persistentLogging}}'
domain: [
[ 'config_json', 'persistentLogging' ]
]
}
{
template:
hostname: '{{hostname}}'
domain: [
[ 'config_json', 'hostname' ]
]
}
{
template:
wifi:
ssid: '{{networkSsid}}'
'wifi-security':
psk: '{{networkKey}}'
domain: [
[ 'system_connections', connnectionFileName, 'wifi' ]
[ 'system_connections', connnectionFileName, 'wifi-security' ]
]
}
]
files:
system_connections:
fileset: true
type: 'ini'
location:
path: CONNECTIONS_FOLDER.slice(1)
# Reconfix still uses the older resin-image-fs, so still needs an
# object-based partition definition.
partition: BOOT_PARTITION
config_json:
type: 'json'
location:
path: 'config.json'
partition: BOOT_PARTITION
inquirerOptions = (data) -> [
{
message: 'Network SSID'
type: 'input'
name: 'networkSsid'
default: data.networkSsid
}
{
message: 'Network Key'
type: 'input'
name: 'networkKey'
default: data.networkKey
}
{
message: 'Do you want to set advanced settings?'
type: 'confirm'
name: 'advancedSettings'
default: false
}
{
message: 'Device Hostname'
type: 'input'
name: 'hostname'
default: data.hostname,
when: (answers) ->
answers.advancedSettings
}
{
message: 'Do you want to enable persistent logging?'
type: 'confirm'
name: 'persistentLogging'
default: data.persistentLogging
when: (answers) ->
answers.advancedSettings
}
]
getConfiguration = (data) ->
_ = require('lodash')
inquirer = require('inquirer')
# `persistentLogging` can be `undefined`, so we want
# to make sure that case defaults to `false`
data = _.assign data,
persistentLogging: data.persistentLogging or false
inquirer.prompt(inquirerOptions(data))
.then (answers) ->
return _.merge(data, answers)
# Taken from https://goo.gl/kr1kCt
CONNECTION_FILE = '''
[connection]
id=resin-wifi
type=wifi
[wifi]
hidden=true
mode=infrastructure
ssid=My_Wifi_Ssid
[wifi-security]
auth-alg=open
key-mgmt=wpa-psk
psk=super_secret_wifi_password
[ipv4]
method=auto
[ipv6]
addr-gen-mode=stable-privacy
method=auto
'''
###
* if the `resin-wifi` file exists (previously configured image or downloaded from the UI) it's used and reconfigured
* if the `resin-sample.ignore` exists it's copied to `resin-wifi`
* if the `resin-sample` exists it's reconfigured (legacy mode, will be removed eventually)
* otherwise, the new file is created
###
prepareConnectionFile = (target) ->
_ = require('lodash')
imagefs = require('resin-image-fs')
imagefs.listDirectory
image: target
partition: BOOT_PARTITION
path: CONNECTIONS_FOLDER
.then (files) ->
# The required file already exists
if _.includes(files, 'resin-wifi')
return null
# Fresh image, new mode, according to https://github.com/resin-os/meta-resin/pull/770/files
if _.includes(files, 'resin-sample.ignore')
return imagefs.copy
image: target
partition: BOOT_PARTITION
path: "#{CONNECTIONS_FOLDER}/resin-sample.ignore"
,
image: target
partition: BOOT_PARTITION
path: "#{CONNECTIONS_FOLDER}/resin-wifi"
.thenReturn(null)
# Legacy mode, to be removed later
# We return the file name override from this branch
# When it is removed the following cleanup should be done:
# * delete all the null returns from this method
# * turn `getConfigurationSchema` back into the constant, with the connection filename always being `resin-wifi`
# * drop the final `then` from this method
# * adapt the code in the main listener to not receive the config from this method, and use that constant instead
if _.includes(files, 'resin-sample')
return 'resin-sample'
# In case there's no file at all (shouldn't happen normally, but the file might have been removed)
return imagefs.writeFile
image: target
partition: BOOT_PARTITION
path: "#{CONNECTIONS_FOLDER}/resin-wifi"
, CONNECTION_FILE
.thenReturn(null)
.then (connectionFileName) ->
return getConfigurationSchema(connectionFileName)
removeHostname = (schema) ->
_ = require('lodash')
schema.mapper = _.reject schema.mapper, (mapper) ->
_.isEqual(Object.keys(mapper.template), ['hostname'])
module.exports =
signature: 'local configure <target>'
description: '(Re)configure a resinOS drive or image'
help: '''
Use this command to configure or reconfigure a resinOS drive or image.
Examples:
$ resin local configure /dev/sdc
$ resin local configure path/to/image.img
'''
root: true
action: (params, options, done) ->
Promise = require('bluebird')
umount = require('umount')
umountAsync = Promise.promisify(umount.umount)
isMountedAsync = Promise.promisify(umount.isMounted)
reconfix = require('reconfix')
denymount = Promise.promisify(require('denymount'))
prepareConnectionFile(params.target)
.tap ->
isMountedAsync(params.target).then (isMounted) ->
return if not isMounted
umountAsync(params.target)
.then (configurationSchema) ->
denymount params.target, (cb) ->
reconfix.readConfiguration(configurationSchema, params.target)
.then(getConfiguration)
.then (answers) ->
if not answers.hostname
removeHostname(configurationSchema)
reconfix.writeConfiguration(configurationSchema, answers, params.target)
.asCallback(cb)
.then ->
console.log('Done!')
.asCallback(done)
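The comment block above describes a four-way choice over the files found in the boot partition's system-connections folder. The same decision expressed as a pure function, for illustration only (the names and return shape are assumptions; the real code returns either null or a file-name override for the schema):
type ConnectionFileAction =
	| { kind: 'use-existing' } // resin-wifi is already there
	| { kind: 'copy-sample' } // copy resin-sample.ignore to resin-wifi
	| { kind: 'legacy'; fileName: string } // reuse resin-sample (to be removed)
	| { kind: 'create' }; // write a fresh resin-wifi file
function chooseConnectionFile(files: string[]): ConnectionFileAction {
	if (files.includes('resin-wifi')) {
		return { kind: 'use-existing' };
	}
	if (files.includes('resin-sample.ignore')) {
		return { kind: 'copy-sample' };
	}
	if (files.includes('resin-sample')) {
		return { kind: 'legacy', fileName: 'resin-sample' };
	}
	return { kind: 'create' };
}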

View File

@ -0,0 +1,120 @@
###
Copyright 2017 Resin.io
Licensed under the Apache License, Version 2.0 (the 'License');
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an 'AS IS' BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
###
module.exports =
signature: 'local flash <image>'
description: 'Flash an image to a drive'
help: '''
Use this command to flash a resinOS image to a drive.
Examples:
$ resin local flash path/to/resinos.img
$ resin local flash path/to/resinos.img --drive /dev/disk2
$ resin local flash path/to/resinos.img --drive /dev/disk2 --yes
'''
options: [
signature: 'yes'
boolean: true
description: 'confirm non-interactively'
alias: 'y'
,
signature: 'drive'
parameter: 'drive'
description: 'drive'
alias: 'd'
]
root: true
action: (params, options, done) ->
_ = require('lodash')
os = require('os')
Promise = require('bluebird')
umountAsync = Promise.promisify(require('umount').umount)
fs = Promise.promisifyAll(require('fs'))
driveListAsync = Promise.promisify(require('drivelist').list)
chalk = require('chalk')
visuals = require('resin-cli-visuals')
form = require('resin-cli-form')
imageWrite = require('etcher-image-write')
form.run [
{
message: 'Select drive'
type: 'drive'
name: 'drive'
},
{
message: 'This will erase the selected drive. Are you sure?'
type: 'confirm'
name: 'yes'
default: false
}
],
override:
drive: options.drive
# If `options.yes` is `false`, pass `undefined`,
# otherwise the question will not be asked because
# `false` is a defined value.
yes: options.yes || undefined
# TODO: dedupe with the resin-device-operations
.then (answers) ->
if answers.yes isnt true
console.log(chalk.red.bold('Aborted image flash'))
process.exit(0)
driveListAsync().then (drives) ->
selectedDrive = _.find(drives, device: answers.drive)
if not selectedDrive?
throw new Error("Drive not found: #{answers.drive}")
return selectedDrive
.then (selectedDrive) ->
progressBars =
write: new visuals.Progress('Flashing')
check: new visuals.Progress('Validating')
umountAsync(selectedDrive.device).then ->
Promise.props
imageSize: fs.statAsync(params.image).get('size'),
imageStream: Promise.resolve(fs.createReadStream(params.image))
driveFileDescriptor: fs.openAsync(selectedDrive.raw, 'rs+')
.then (results) ->
imageWrite.write
fd: results.driveFileDescriptor
device: selectedDrive.raw
size: selectedDrive.size
,
stream: results.imageStream,
size: results.imageSize
,
check: true
.then (writer) ->
new Promise (resolve, reject) ->
writer.on 'progress', (state) ->
progressBars[state.type].update(state)
writer.on('error', reject)
writer.on('done', resolve)
.then ->
if (os.platform() is 'win32') and selectedDrive.mountpoint?
ejectAsync = Promise.promisify(require('removedrive').eject)
return ejectAsync(selectedDrive.mountpoint)
return umountAsync(selectedDrive.device)
.asCallback(done)

View File

@ -1,5 +1,5 @@
/*
Copyright 2016-2020 Balena
###
Copyright 2017 Resin.io
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
@ -12,14 +12,12 @@ distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/
###
import type { OptionalNavigationResource } from 'balena-sdk';
export const getExpanded = <T>(obj: OptionalNavigationResource<T>) =>
(Array.isArray(obj) && obj[0]) || undefined;
export const getExpandedProp = <T, K extends keyof T>(
obj: OptionalNavigationResource<T>,
key: K,
) => (Array.isArray(obj) && obj[0] && obj[0][key]) || undefined;
exports.configure = require('./configure')
exports.flash = require('./flash')
exports.logs = require('./logs')
exports.scan = require('./scan')
exports.ssh = require('./ssh')
exports.push = require('./push')
exports.stop = require('./stop')

View File

@ -0,0 +1,66 @@
###
Copyright 2017 Resin.io
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
###
# A function to reliably execute a command
# in all supported operating systems, including
# different Windows environments like `cmd.exe`
# and `Cygwin` should be encapsulated in a
# re-usable package.
#
module.exports =
signature: 'local logs [deviceIp]'
description: 'Get or attach to logs of a running container on a resinOS device'
help: '''
Examples:
$ resin local logs
$ resin local logs -f
$ resin local logs 192.168.1.10
$ resin local logs 192.168.1.10 -f
$ resin local logs 192.168.1.10 -f --app-name myapp
'''
options: [
signature: 'follow'
boolean: true
description: 'follow log'
alias: 'f'
,
signature: 'app-name'
parameter: 'name'
description: 'name of container to get logs from'
alias: 'a'
]
root: true
action: (params, options, done) ->
Promise = require('bluebird')
{ forms } = require('resin-sync')
{ selectContainerFromDevice, pipeContainerStream } = require('./common')
Promise.try ->
if not params.deviceIp?
return forms.selectLocalResinOsDevice()
return params.deviceIp
.then (@deviceIp) =>
if not options['app-name']?
return selectContainerFromDevice(@deviceIp)
return options['app-name']
.then (appName) =>
pipeContainerStream
deviceIp: @deviceIp
name: appName
outStream: process.stdout
follow: options['follow']

View File

@ -0,0 +1,78 @@
###
Copyright 2016-2017 Resin.io
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
###
# Loads '.resin-sync.yml' configuration from 'source' directory.
# Returns the configuration object on success
#
_ = require('lodash')
resinPush = require('resin-sync').capitano('resin-toolbox')
# TODO: This is a temporary workaround to reuse the existing `rdt push`
# capitano frontend in `resin local push`.
resinPushHelp = '''
Warning: 'resin local push' requires an openssh-compatible client and 'rsync' to
be correctly installed in your shell environment. For more information (including
Windows support) please check the README here: https://github.com/resin-io/resin-cli
Use this command to push your local changes to a container on a LAN-accessible resinOS device on the fly.
If `Dockerfile` or any file in the 'build-triggers' list is changed,
a new container will be built and run on your device.
If not, changes will simply be synced with `rsync` into the application container.
After every 'resin local push' the updated settings will be saved in
'<source>/.resin-sync.yml' and will be used in later invocations. You can
also change any option by editing '.resin-sync.yml' directly.
Here is an example '.resin-sync.yml' :
$ cat $PWD/.resin-sync.yml
local_resinos:
app-name: local-app
build-triggers:
- Dockerfile: file-hash-abcdefabcdefabcdefabcdefabcdefabcdef
- package.json: file-hash-abcdefabcdefabcdefabcdefabcdefabcdef
environment:
- MY_VARIABLE=123
Command line options have precedence over the ones saved in '.resin-sync.yml'.
If '.gitignore' is found in the source directory then all explicitly listed files will be
excluded when using rsync to update the container. You can choose to change this default behavior with the
'--skip-gitignore' option.
Examples:
$ resin local push
$ resin local push --app-name test-server --build-triggers package.json,requirements.txt
$ resin local push --force-build
$ resin local push --force-build --skip-logs
$ resin local push --ignore lib/
$ resin local push --verbose false
$ resin local push 192.168.2.10 --source . --destination /usr/src/app
$ resin local push 192.168.2.10 -s /home/user/myResinProject -d /usr/src/app --before 'echo Hello' --after 'echo Done'
'''
module.exports = _.assign resinPush,
signature: 'local push [deviceIp]'
help: resinPushHelp
primary: true
root: true

View File

@ -0,0 +1,100 @@
###
Copyright 2017 Resin.io
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
###
dockerInfoProperties = [
'Containers'
'ContainersRunning'
'ContainersPaused'
'ContainersStopped'
'Images'
'Driver'
'SystemTime'
'KernelVersion'
'OperatingSystem'
'Architecture'
]
dockerVersionProperties = [
'Version'
'ApiVersion'
]
module.exports =
signature: 'local scan'
description: 'Scan for resinOS devices in your local network'
help: '''
Examples:
$ resin local scan
$ resin local scan --timeout 120
$ resin local scan --verbose
'''
options: [
signature: 'verbose'
boolean: true
description: 'Display full info'
alias: 'v'
,
signature: 'timeout'
parameter: 'timeout'
description: 'Scan timeout in seconds'
alias: 't'
]
primary: true
root: true
action: (params, options, done) ->
Promise = require('bluebird')
_ = require('lodash')
prettyjson = require('prettyjson')
{ discover } = require('resin-sync')
{ SpinnerPromise } = require('resin-cli-visuals')
{ dockerPort, dockerTimeout } = require('./common')
dockerUtils = require('../../utils/docker')
{ exitWithExpectedError } = require('../../utils/patterns')
if options.timeout?
options.timeout *= 1000
Promise.try ->
new SpinnerPromise
promise: discover.discoverLocalResinOsDevices(options.timeout)
startMessage: 'Scanning for local resinOS devices..'
stopMessage: 'Reporting scan results'
.filter ({ address }) ->
Promise.try ->
docker = dockerUtils.createClient(host: address, port: dockerPort, timeout: dockerTimeout)
docker.pingAsync()
.return(true)
.catchReturn(false)
.tap (devices) ->
if _.isEmpty(devices)
exitWithExpectedError('Could not find any resinOS devices in the local network')
.map ({ host, address }) ->
docker = dockerUtils.createClient(host: address, port: dockerPort, timeout: dockerTimeout)
Promise.props
dockerInfo: docker.infoAsync().catchReturn('Could not get Docker info')
dockerVersion: docker.versionAsync().catchReturn('Could not get Docker version')
.then ({ dockerInfo, dockerVersion }) ->
if not options.verbose
dockerInfo = _.pick(dockerInfo, dockerInfoProperties) if _.isObject(dockerInfo)
dockerVersion = _.pick(dockerVersion, dockerVersionProperties) if _.isObject(dockerVersion)
return { host, address, dockerInfo, dockerVersion }
.then (devicesInfo) ->
console.log(prettyjson.render(devicesInfo, noColor: true))
.nodeify(done)
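`local scan` keeps only the devices whose Docker daemon answers a ping within the timeout, then gathers info and version for each in parallel. A rough sketch of the reachability filter alone, with `pingDocker` standing in for the dockerode ping call (both names are assumptions):
// Illustrative only: keep the devices whose engine answered the ping.
async function filterReachable<T extends { address: string }>(
	devices: T[],
	pingDocker: (address: string) => Promise<void>,
): Promise<T[]> {
	const reachable = await Promise.all(
		devices.map((d) => pingDocker(d.address).then(() => true, () => false)),
	);
	return devices.filter((_d, i) => reachable[i]);
}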

View File

@ -0,0 +1,114 @@
###
Copyright 2017 Resin.io
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
###
{ hostOSAccess } = require('../command-options')
_ = require('lodash')
localHostOSAccessOption = _.cloneDeep(hostOSAccess)
localHostOSAccessOption.description = 'get a shell into the host OS'
module.exports =
signature: 'local ssh [deviceIp]'
description: 'Get a shell into a resinOS device'
help: '''
Warning: 'resin local ssh' requires an openssh-compatible client to be correctly
installed in your shell environment. For more information (including Windows
support) please check the README here: https://github.com/resin-io/resin-cli
Use this command to get a shell into the running application container of
your device.
The '--host' option will get you a shell into the Host OS of the resinOS device.
With no option, a list of containers to enter will be shown; alternatively, you can explicitly
select one by passing its name to the --container option.
Examples:
$ resin local ssh
$ resin local ssh --host
$ resin local ssh --container chaotic_water
$ resin local ssh --container chaotic_water --port 22222
$ resin local ssh --verbose
'''
options: [
signature: 'verbose'
boolean: true
description: 'increase verbosity'
alias: 'v'
,
localHostOSAccessOption,
signature: 'container'
parameter: 'container'
default: null
description: 'name of container to access'
alias: 'c'
,
signature: 'port'
parameter: 'port'
description: 'ssh port number (default: 22222)'
alias: 'p'
]
root: true
action: (params, options, done) ->
child_process = require('child_process')
Promise = require 'bluebird'
_ = require('lodash')
{ forms } = require('resin-sync')
{ selectContainerFromDevice, getSubShellCommand } = require('./common')
{ exitWithExpectedError } = require('../../utils/patterns')
if (options.host is true and options.container?)
exitWithExpectedError('Please pass either --host or --container option')
if not options.port?
options.port = 22222
verbose = if options.verbose then '-vvv' else ''
Promise.try ->
if not params.deviceIp?
return forms.selectLocalResinOsDevice()
return params.deviceIp
.then (deviceIp) ->
_.assign(options, { deviceIp })
return if options.host
if not options.container?
return selectContainerFromDevice(deviceIp)
return options.container
.then (container) ->
command = "ssh \
#{verbose} \
-t \
-p #{options.port} \
-o LogLevel=ERROR \
-o StrictHostKeyChecking=no \
-o UserKnownHostsFile=/dev/null \
root@#{options.deviceIp}"
if not options.host
shellCmd = '''/bin/sh -c $"'if [ -e /bin/bash ]; then exec /bin/bash; else exec /bin/sh; fi'"'''
dockerCmd = "'$(if [ -f /usr/bin/balena ]; then echo \"balena\"; else echo \"docker\"; fi)'"
command += " #{dockerCmd} exec -ti #{container} #{shellCmd}"
subShellCommand = getSubShellCommand(command)
child_process.spawn subShellCommand.program, subShellCommand.args,
stdio: 'inherit'
.nodeify(done)
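For a non-host session, the action above appends an exec command that picks the balena engine binary when present (falling back to docker) and prefers bash over sh inside the container. Shown below purely as string construction with placeholder values, not as the CLI's actual implementation:
// Placeholder values for illustration only.
const deviceIp = '192.168.1.25';
const port = 22222;
const container = 'chaotic_water';
const dockerCmd = `'$(if [ -f /usr/bin/balena ]; then echo "balena"; else echo "docker"; fi)'`;
const shellCmd = `/bin/sh -c $"'if [ -e /bin/bash ]; then exec /bin/bash; else exec /bin/sh; fi'"`;
const command =
	`ssh -t -p ${port} -o LogLevel=ERROR -o StrictHostKeyChecking=no ` +
	`-o UserKnownHostsFile=/dev/null root@${deviceIp} ` +
	`${dockerCmd} exec -ti ${container} ${shellCmd}`;
console.log(command);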

View File

@ -0,0 +1,79 @@
###
Copyright 2017 Resin.io
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
###
# A function to reliably execute a command
# in all supported operating systems, including
# different Windows environments like `cmd.exe`
# and `Cygwin` should be encapsulated in a
# re-usable package.
#
module.exports =
signature: 'local stop [deviceIp]'
description: 'Stop a running container on a resinOS device'
help: '''
Examples:
$ resin local stop
$ resin local stop --app-name myapp
$ resin local stop --all
$ resin local stop 192.168.1.10
$ resin local stop 192.168.1.10 --app-name myapp
'''
options: [
signature: 'all'
boolean: true
description: 'stop all containers'
,
signature: 'app-name'
parameter: 'name'
description: 'name of container to stop'
alias: 'a'
]
root: true
action: (params, options, done) ->
Promise = require('bluebird')
chalk = require('chalk')
{ forms, config, ResinLocalDockerUtils } = require('resin-sync')
{ selectContainerFromDevice, filterOutSupervisorContainer } = require('./common')
Promise.try ->
if not params.deviceIp?
return forms.selectLocalResinOsDevice()
return params.deviceIp
.then (@deviceIp) =>
@docker = new ResinLocalDockerUtils(@deviceIp)
if options.all
# Only list running containers
return @docker.docker.listContainersAsync(all: false)
.filter(filterOutSupervisorContainer)
.then (containers) =>
Promise.map containers, ({ Names, Id }) =>
console.log(chalk.yellow.bold("* Stopping container #{Names[0]}"))
@docker.stopContainer(Id)
ymlConfig = config.load()
@appName = options['app-name'] ? ymlConfig['local_resinos']?['app-name']
@docker.checkForRunningContainer(@appName)
.then (isRunning) =>
if not isRunning
return selectContainerFromDevice(@deviceIp, true)
console.log(chalk.yellow.bold("* Stopping container #{@appName}"))
return @appName
.then (runningContainerName) =>
@docker.stopContainer(runningContainerName)

61
lib/actions/logs.coffee Normal file
View File

@ -0,0 +1,61 @@
###
Copyright 2016-2017 Resin.io
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
###
{ normalizeUuidProp } = require('../utils/normalization')
module.exports =
signature: 'logs <uuid>'
description: 'show device logs'
help: '''
Use this command to show logs for a specific device.
By default, the command prints all log messages and exits.
To continuously stream output, and see new logs in real time, use the `--tail` option.
Examples:
$ resin logs 23c73a1
$ resin logs 23c73a1
'''
options: [
{
signature: 'tail'
description: 'continuously stream output'
boolean: true
alias: 't'
}
]
permission: 'user'
primary: true
action: (params, options, done) ->
normalizeUuidProp(params)
resin = require('resin-sdk').fromSharedOptions()
moment = require('moment')
printLine = (line) ->
timestamp = moment(line.timestamp).format('DD.MM.YY HH:mm:ss (ZZ)')
console.log("#{timestamp} #{line.message}")
if options.tail
resin.logs.subscribe(params.uuid, { count: 100 }).then (logs) ->
logs.on('line', printLine)
logs.on('error', done)
.catch(done)
else
resin.logs.history(params.uuid)
.each(printLine)
.catch(done)

55
lib/actions/notes.coffee Normal file
View File

@ -0,0 +1,55 @@
###
Copyright 2016-2017 Resin.io
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
###
{ normalizeUuidProp } = require('../utils/normalization')
exports.set =
signature: 'note <|note>'
description: 'set a device note'
help: '''
Use this command to set or update a device note.
If the note argument isn't passed, the tool attempts to read the note from `stdin`.
To view the notes, use $ resin device <uuid>.
Examples:
$ resin note "My useful note" --device 7cf02a6
$ cat note.txt | resin note --device 7cf02a6
'''
options: [
signature: 'device'
parameter: 'device'
description: 'device uuid'
alias: [ 'd', 'dev' ]
required: 'You have to specify a device'
]
permission: 'user'
action: (params, options, done) ->
normalizeUuidProp(options, 'device')
Promise = require('bluebird')
_ = require('lodash')
resin = require('resin-sdk').fromSharedOptions()
{ exitWithExpectedError } = require('../utils/patterns')
Promise.try ->
if _.isEmpty(params.note)
exitWithExpectedError('Missing note content')
resin.models.device.note(options.device, params.note)
.nodeify(done)

382
lib/actions/os.coffee Normal file
View File

@ -0,0 +1,382 @@
###
Copyright 2016-2017 Resin.io
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
###
commandOptions = require('./command-options')
_ = require('lodash')
{ normalizeUuidProp } = require('../utils/normalization')
formatVersion = (v, isRecommended) ->
result = "v#{v}"
if isRecommended
result += ' (recommended)'
return result
resolveVersion = (deviceType, version) ->
if version isnt 'menu'
if version[0] == 'v'
version = version.slice(1)
return Promise.resolve(version)
form = require('resin-cli-form')
resin = require('resin-sdk').fromSharedOptions()
resin.models.os.getSupportedVersions(deviceType)
.then ({ versions, recommended }) ->
choices = versions.map (v) ->
value: v
name: formatVersion(v, v is recommended)
return form.ask
message: 'Select the OS version:'
type: 'list'
choices: choices
default: recommended
exports.versions =
signature: 'os versions <type>'
description: 'show the available resinOS versions for the given device type'
help: '''
Use this command to show the available resinOS versions for a certain device type.
Check available types with `resin devices supported`
Example:
$ resin os versions raspberrypi3
'''
action: (params, options, done) ->
resin = require('resin-sdk').fromSharedOptions()
resin.models.os.getSupportedVersions(params.type)
.then ({ versions, recommended }) ->
versions.forEach (v) ->
console.log(formatVersion(v, v is recommended))
exports.download =
signature: 'os download <type>'
description: 'download an unconfigured os image'
help: '''
Use this command to download an unconfigured os image for a certain device type.
Check available types with `resin devices supported`
If version is not specified the newest stable (non-pre-release) version of OS
is downloaded if available, or the newest version otherwise (if all existing
versions for the given device type are pre-release).
You can pass `--version menu` to pick the OS version from the interactive menu
of all available versions.
Examples:
$ resin os download raspberrypi3 -o ../foo/bar/raspberry-pi.img
$ resin os download raspberrypi3 -o ../foo/bar/raspberry-pi.img --version 1.24.1
$ resin os download raspberrypi3 -o ../foo/bar/raspberry-pi.img --version ^1.20.0
$ resin os download raspberrypi3 -o ../foo/bar/raspberry-pi.img --version latest
$ resin os download raspberrypi3 -o ../foo/bar/raspberry-pi.img --version default
$ resin os download raspberrypi3 -o ../foo/bar/raspberry-pi.img --version menu
'''
permission: 'user'
options: [
{
signature: 'output'
description: 'output path'
parameter: 'output'
alias: 'o'
required: 'You have to specify the output location'
}
commandOptions.osVersion
]
action: (params, options, done) ->
Promise = require('bluebird')
unzip = require('unzip2')
fs = require('fs')
rindle = require('rindle')
manager = require('resin-image-manager')
visuals = require('resin-cli-visuals')
console.info("Getting device operating system for #{params.type}")
displayVersion = ''
Promise.try ->
if not options.version
console.warn('OS version is not specified, using the default version:
the newest stable (non-pre-release) version if available,
or the newest version otherwise (if all existing
versions for the given device type are pre-release).')
return 'default'
return resolveVersion(params.type, options.version)
.then (version) ->
if version isnt 'default'
displayVersion = " #{version}"
return manager.get(params.type, version)
.then (stream) ->
bar = new visuals.Progress("Downloading Device OS#{displayVersion}")
spinner = new visuals.Spinner("Downloading Device OS#{displayVersion} (size unknown)")
stream.on 'progress', (state) ->
if state?
bar.update(state)
else
spinner.start()
stream.on 'end', ->
spinner.stop()
# We completely rely on the `mime` custom property
# to make this decision.
# The actual stream should be checked instead.
if stream.mime is 'application/zip'
output = unzip.Extract(path: options.output)
else
output = fs.createWriteStream(options.output)
return rindle.wait(stream.pipe(output)).return(options.output)
.tap (output) ->
console.info('The image was downloaded successfully')
.nodeify(done)
buildConfig = (image, deviceType, advanced = false) ->
form = require('resin-cli-form')
helpers = require('../utils/helpers')
helpers.getManifest(image, deviceType)
.get('options')
.then (questions) ->
if not advanced
advancedGroup = _.find questions,
name: 'advanced'
isGroup: true
if advancedGroup?
override = helpers.getGroupDefaults(advancedGroup)
return form.run(questions, { override })
exports.buildConfig =
signature: 'os build-config <image> <device-type>'
description: 'build the OS config and save it to the JSON file'
help: '''
Use this command to prebuild the OS config once and skip the interactive part of `resin os configure`.
Example:
$ resin os build-config ../path/rpi3.img raspberrypi3 --output rpi3-config.json
$ resin os configure ../path/rpi3.img 7cf02a6 --config "$(cat rpi3-config.json)"
'''
permission: 'user'
options: [
commandOptions.advancedConfig
{
signature: 'output'
description: 'the path to the output JSON file'
alias: 'o'
required: 'the output path is required'
parameter: 'output'
}
]
action: (params, options, done) ->
fs = require('fs')
Promise = require('bluebird')
writeFileAsync = Promise.promisify(fs.writeFile)
buildConfig(params.image, params['device-type'], options.advanced)
.then (answers) ->
writeFileAsync(options.output, JSON.stringify(answers, null, 4))
.nodeify(done)
exports.configure =
signature: 'os configure <image> [uuid] [deviceApiKey]'
description: 'configure an os image'
help: '''
Use this command to configure a previously downloaded operating system image for
the specific device or for an application generally.
Calling this command without --version is not recommended, and may fail in
future releases if the OS version cannot be inferred.
Note that device api keys are only supported on ResinOS 2.0.3+.
This command still supports the *deprecated* format where the UUID and optionally device key
are passed directly on the command line, but the recommended way is to pass either an --app or
--device argument. The deprecated format will be removed in a future release.
Examples:
$ resin os configure ../path/rpi.img --device 7cf02a6 --version 2.12.7
$ resin os configure ../path/rpi.img --device 7cf02a6 --version 2.12.7 --device-api-key <existingDeviceKey>
$ resin os configure ../path/rpi.img --app MyApp --version 2.12.7
'''
permission: 'user'
options: [
commandOptions.advancedConfig
commandOptions.optionalApplication
commandOptions.optionalDevice
commandOptions.optionalDeviceApiKey
commandOptions.optionalOsVersion
{
signature: 'config'
description: 'path to the config JSON file, see `resin os build-config`'
parameter: 'config'
}
]
action: (params, options, done) ->
normalizeUuidProp(params)
normalizeUuidProp(options, 'device')
fs = require('fs')
Promise = require('bluebird')
readFileAsync = Promise.promisify(fs.readFile)
resin = require('resin-sdk').fromSharedOptions()
init = require('resin-device-init')
helpers = require('../utils/helpers')
patterns = require('../utils/patterns')
{ generateDeviceConfig, generateApplicationConfig } = require('../utils/config')
if _.filter([
options.device
options.application
params.uuid
]).length != 1
patterns.exitWithExpectedError '''
To configure an image, you must provide exactly one of:
* A device, with --device <uuid>
* An application, with --app <appname>
* [Deprecated] A device, passing its uuid directly on the command line
See the help page for examples:
$ resin help os configure
'''
if params.uuid
console.warn(
'Directly passing a UUID to `resin os configure` is deprecated, and will stop working in future.\n' +
'Pass your device UUID with --device <uuid> instead.' +
if params.deviceApiKey
' Device api keys can be passed with --deviceApiKey.\n'
else '\n'
)
uuid = options.device || params.uuid
deviceApiKey = options.deviceApiKey || params.deviceApiKey
console.info('Configuring operating system image')
configurationResourceType = if uuid then 'device' else 'application'
resin.models[configurationResourceType].get(uuid || options.application)
.then (appOrDevice) ->
Promise.try ->
if options.config
return readFileAsync(options.config, 'utf8')
.then(JSON.parse)
return buildConfig(params.image, appOrDevice.device_type, options.advanced)
.then (answers) ->
answers.version = options.version
(if configurationResourceType == 'device'
generateDeviceConfig(appOrDevice, deviceApiKey, answers)
else
generateApplicationConfig(appOrDevice, answers)
).then (config) ->
init.configure(params.image, appOrDevice.device_type, config, answers)
.then(helpers.osProgressHandler)
.nodeify(done)
INIT_WARNING_MESSAGE = '''
Note: Initializing the device may ask for administrative permissions
because we need to access the raw devices directly.
'''
exports.initialize =
signature: 'os initialize <image>'
description: 'initialize an os image'
help: """
Use this command to initialize a device with previously configured operating system image.
#{INIT_WARNING_MESSAGE}
Examples:
$ resin os initialize ../path/rpi.img --type 'raspberry-pi'
"""
permission: 'user'
options: [
commandOptions.yes
{
signature: 'type'
description: 'device type (Check available types with `resin devices supported`)'
parameter: 'type'
alias: 't'
required: 'You have to specify a device type'
}
commandOptions.drive
]
action: (params, options, done) ->
Promise = require('bluebird')
umountAsync = Promise.promisify(require('umount').umount)
form = require('resin-cli-form')
patterns = require('../utils/patterns')
helpers = require('../utils/helpers')
console.info("""
Initializing device
#{INIT_WARNING_MESSAGE}
""")
helpers.getManifest(params.image, options.type)
.then (manifest) ->
return manifest.initialization?.options
.then (questions) ->
return form.run questions,
override:
drive: options.drive
.tap (answers) ->
return if not answers.drive?
patterns.confirm(
options.yes
"This will erase #{answers.drive}. Are you sure?"
"Going to erase #{answers.drive}."
)
.return(answers.drive)
.then(umountAsync)
.tap (answers) ->
return helpers.sudo([
'internal'
'osinit'
params.image
options.type
JSON.stringify(answers)
])
.then (answers) ->
return if not answers.drive?
# TODO: resin local makes use of ejectAsync, see below
# DO we need this / should we do that here?
# getDrive = (drive) ->
# driveListAsync().then (drives) ->
# selectedDrive = _.find(drives, device: drive)
# if not selectedDrive?
# throw new Error("Drive not found: #{drive}")
# return selectedDrive
# if (os.platform() is 'win32') and selectedDrive.mountpoint?
# ejectAsync = Promise.promisify(require('removedrive').eject)
# return ejectAsync(selectedDrive.mountpoint)
umountAsync(answers.drive).tap ->
console.info("You can safely remove #{answers.drive} now")
.nodeify(done)
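`resolveVersion` above strips a leading 'v' and treats the string 'menu' as a sentinel that triggers the interactive version picker. The non-interactive part as a tiny illustration (the function name is hypothetical):
// Illustrative only: 'menu' means "ask interactively"; anything else is
// normalised by dropping a leading 'v' (e.g. 'v2.12.7' -> '2.12.7').
function normalizeOsVersion(version: string): string {
	if (version === 'menu') {
		return version;
	}
	return version.startsWith('v') ? version.slice(1) : version;
}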

277
lib/actions/preload.coffee Normal file
View File

@ -0,0 +1,277 @@
###
Copyright 2016-2017 Resin.io
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
###
dockerUtils = require('../utils/docker')
LATEST = 'latest'
getApplicationsWithSuccessfulBuilds = (deviceType) ->
preload = require('resin-preload')
resin = require('resin-sdk').fromSharedOptions()
resin.pine.get
resource: 'my_application'
options:
$filter:
device_type: deviceType
owns__release:
$any:
$alias: 'r'
$expr:
r:
status: 'success'
$expand: preload.applicationExpandOptions
$select: [ 'id', 'app_name', 'device_type', 'commit', 'should_track_latest_release' ]
$orderby: 'app_name asc'
selectApplication = (deviceType) ->
visuals = require('resin-cli-visuals')
form = require('resin-cli-form')
{ exitWithExpectedError } = require('../utils/patterns')
applicationInfoSpinner = new visuals.Spinner('Downloading list of applications and releases.')
applicationInfoSpinner.start()
getApplicationsWithSuccessfulBuilds(deviceType)
.then (applications) ->
applicationInfoSpinner.stop()
if applications.length == 0
exitWithExpectedError("You have no apps with successful releases for a '#{deviceType}' device type.")
form.ask
message: 'Select an application'
type: 'list'
choices: applications.map (app) ->
name: app.app_name
value: app
selectApplicationCommit = (releases) ->
form = require('resin-cli-form')
{ exitWithExpectedError } = require('../utils/patterns')
if releases.length == 0
exitWithExpectedError('This application has no successful releases.')
DEFAULT_CHOICE = { 'name': LATEST, 'value': LATEST }
choices = [ DEFAULT_CHOICE ].concat releases.map (release) ->
name: "#{release.end_timestamp} - #{release.commit}"
value: release.commit
return form.ask
message: 'Select a release'
type: 'list'
default: LATEST
choices: choices
offerToDisableAutomaticUpdates = (application, commit) ->
Promise = require('bluebird')
resin = require('resin-sdk').fromSharedOptions()
form = require('resin-cli-form')
if commit == LATEST or not application.should_track_latest_release
return Promise.resolve()
message = '''
This application is set to automatically update all devices to the latest available version.
This might be unexpected behaviour: with this enabled, the preloaded device will still
download and install the latest release once it is online.
Do you want to disable automatic updates for this application?
Warning: To re-enable this requires direct api calls,
see https://docs.resin.io/reference/api/resources/device/#set-device-to-release
'''
form.ask
message: message,
type: 'confirm'
.then (update) ->
if not update
return
resin.pine.patch
resource: 'application'
id: application.id
body:
should_track_latest_release: false
module.exports =
signature: 'preload <image>'
description: '(beta) preload an app on a disk image (or Edison zip archive)'
help: '''
Warning: "resin preload" requires Docker to be correctly installed in
your shell environment. For more information (including Windows support)
please check the README here: https://github.com/resin-io/resin-cli .
Use this command to preload an application to a local disk image (or
Edison zip archive) with a built release from Resin.io.
Examples:
$ resin preload resin.img --app 1234 --commit e1f2592fc6ee949e68756d4f4a48e49bff8d72a0 --splash-image some-image.png
$ resin preload resin.img
'''
permission: 'user'
primary: true
options: dockerUtils.appendConnectionOptions [
{
signature: 'app'
parameter: 'appId'
description: 'id of the application to preload'
alias: 'a'
}
{
signature: 'commit'
parameter: 'hash'
description: '''
the commit hash for a specific application release to preload, use "latest" to specify the latest release
(ignored if no appId is given)
'''
alias: 'c'
}
{
signature: 'splash-image'
parameter: 'splashImage.png'
description: 'path to a png image to replace the splash screen'
alias: 's'
}
{
signature: 'dont-check-device-type'
boolean: true
description: 'Disables check for matching device types in image and application'
}
{
signature: 'pin-device-to-release'
boolean: true
description: 'Pin the preloaded device to the preloaded release on provision'
alias: 'p'
}
]
action: (params, options, done) ->
_ = require('lodash')
Promise = require('bluebird')
resin = require('resin-sdk').fromSharedOptions()
preload = require('resin-preload')
visuals = require('resin-cli-visuals')
nodeCleanup = require('node-cleanup')
{ exitWithExpectedError } = require('../utils/patterns')
progressBars = {}
progressHandler = (event) ->
progressBar = progressBars[event.name]
if not progressBar
progressBar = progressBars[event.name] = new visuals.Progress(event.name)
progressBar.update(percentage: event.percentage)
spinners = {}
spinnerHandler = (event) ->
spinner = spinners[event.name]
if not spinner
spinner = spinners[event.name] = new visuals.Spinner(event.name)
if event.action == 'start'
spinner.start()
else
console.log()
spinner.stop()
options.image = params.image
options.appId = options.app
delete options.app
options.splashImage = options['splash-image']
delete options['splash-image']
options.dontCheckDeviceType = options['dont-check-device-type']
delete options['dont-check-device-type']
if options.dontCheckDeviceType and not options.appId
exitWithExpectedError('You need to specify an app id if you disable the device type check.')
options.pinDevice = options['pin-device-to-release']
delete options['pin-device-to-release']
# Get a configured dockerode instance
dockerUtils.getDocker(options)
.then (docker) ->
preloader = new preload.Preloader(
resin
docker
options.appId
options.commit
options.image
options.splashImage
options.proxy
options.dontCheckDeviceType
options.pinDevice
)
gotSignal = false
nodeCleanup (exitCode, signal) ->
if signal
gotSignal = true
nodeCleanup.uninstall() # don't call cleanup handler again
preloader.cleanup()
.then ->
# calling process.exit() won't inform parent process of signal
process.kill(process.pid, signal)
return false
if process.env.DEBUG
preloader.stderr.pipe(process.stderr)
preloader.on('progress', progressHandler)
preloader.on('spinner', spinnerHandler)
return new Promise (resolve, reject) ->
preloader.on('error', reject)
preloader.prepare()
.then ->
# If no appId was provided, show a list of matching apps
Promise.try ->
if not preloader.appId
selectApplication(preloader.config.deviceType)
.then (application) ->
preloader.setApplication(application)
.then ->
# Use the commit given as --commit or show an interactive commit selection menu
Promise.try ->
if options.commit
if options.commit == LATEST and preloader.application.commit
# handle `--commit latest`
return LATEST
release = _.find preloader.application.owns__release, (release) ->
release.commit.startsWith(options.commit)
if not release
exitWithExpectedError('There is no release matching this commit')
return release.commit
selectApplicationCommit(preloader.application.owns__release)
.then (commit) ->
if commit == LATEST
preloader.commit = preloader.application.commit
else
preloader.commit = commit
# Propose to disable automatic app updates if the commit is not the latest
offerToDisableAutomaticUpdates(preloader.application, commit)
.then ->
# All options are ready: preload the image.
preloader.preload()
.catch(resin.errors.ResinError, exitWithExpectedError)
.then(resolve)
.catch(reject)
.then(done)
.finally ->
if not gotSignal
preloader.cleanup()
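The `nodeCleanup` handler above defers exit until the preloader's asynchronous cleanup has finished, then re-raises the signal so the parent process still sees it (a plain `process.exit()` would hide it). A minimal sketch of that pattern, assuming the same `node-cleanup` package:
import nodeCleanup = require('node-cleanup');
// Illustrative only: run an async cleanup on a signal, then re-raise the
// signal so the parent process is informed of how we exited.
function exitCleanlyOnSignal(cleanup: () => Promise<void>) {
	nodeCleanup((_exitCode, signal) => {
		if (signal) {
			nodeCleanup.uninstall(); // don't run this handler again
			cleanup().then(() => {
				process.kill(process.pid, signal);
			});
			return false; // keep the process alive until cleanup completes
		}
	});
}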

161
lib/actions/push.ts Normal file
View File

@ -0,0 +1,161 @@
/*
Copyright 2016-2017 Resin.io
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/
import { CommandDefinition } from 'capitano';
import { ResinSDK } from 'resin-sdk';
import { stripIndent } from 'common-tags';
async function getAppOwner(sdk: ResinSDK, appName: string) {
const {
exitWithExpectedError,
selectFromList,
} = await import('../utils/patterns');
const _ = await import('lodash');
const applications = await sdk.models.application.getAll({
$expand: {
user: {
$select: ['username'],
},
},
$filter: {
app_name: appName,
},
$select: ['id'],
});
if (applications == null || applications.length === 0) {
exitWithExpectedError(
stripIndent`
No applications found with name: ${appName}.
This could mean that the application does not exist, or you do
not have the permissions required to access it.`,
);
}
if (applications.length === 1) {
return _.get(applications, '[0].user[0].username');
}
// If we got more than one application with the same name it means that the
// user has access to a collab app with the same name as a personal app. We
// present a list to the user which shows the fully qualified application
// name (user/appname) and allows them to select
const entries = _.map(applications, app => {
const username = _.get(app, 'user[0].username');
return {
name: `${username}/${appName}`,
extra: username,
};
});
const selected = await selectFromList(
`${
entries.length
} applications found with that name, please select the application you would like to push to`,
entries,
);
return selected.extra;
}
export const push: CommandDefinition<
{
application: string;
},
{
source: string;
emulated: boolean;
nocache: boolean;
}
> = {
signature: 'push <application>',
description: 'Start a remote build on the resin.io cloud build servers',
help: stripIndent`
This command can be used to start a build on the remote
resin.io cloud builders. The given source directory will be sent to the
resin.io builder, and the build will proceed. This can be used as a drop-in
replacement for git push to deploy.
Examples:
$ resin push myApp
$ resin push myApp --source <source directory>
$ resin push myApp -s <source directory>
`,
permission: 'user',
options: [
{
signature: 'source',
alias: 's',
description:
'The source that should be sent to the resin builder to be built (defaults to the current directory)',
parameter: 'source',
},
{
signature: 'emulated',
alias: 'e',
description: 'Force an emulated build to occur on the remote builder',
boolean: true,
},
{
signature: 'nocache',
alias: 'c',
description: "Don't use cache when building this project",
boolean: true,
},
],
async action(params, options, done) {
const sdk = (await import('resin-sdk')).fromSharedOptions();
const Bluebird = await import('bluebird');
const remote = await import('../utils/remote-build');
const { exitWithExpectedError } = await import('../utils/patterns');
const app: string | null = params.application;
if (app == null) {
exitWithExpectedError('You must specify an application');
}
const source = options.source || '.';
if (process.env.DEBUG) {
console.log(`[debug] Using ${source} as build source`);
}
Bluebird.join(
sdk.auth.getToken(),
sdk.settings.get('resinUrl'),
getAppOwner(sdk, app),
(token, baseUrl, owner) => {
const opts = {
emulated: options.emulated,
nocache: options.nocache,
};
const args = {
app,
owner,
source,
auth: token,
baseUrl,
sdk,
opts,
};
return remote.startRemoteBuild(args);
},
).nodeify(done);
},
};
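`getAppOwner` above disambiguates the case where the same application name is visible both as a personal app and as a collaborator app by offering a `username/appname` choice list. The list-building step in isolation, for illustration (the record shape is an assumption matching the `$expand` used above):
// Illustrative only: build the "username/appname" choices that the selection
// prompt in getAppOwner presents when more than one application matches.
interface ApplicationWithUser {
	user: Array<{ username: string }>;
}
function ownerChoices(appName: string, applications: ApplicationWithUser[]) {
	return applications.map((app) => {
		const username = app.user[0] && app.user[0].username;
		return { name: `${username}/${appName}`, extra: username };
	});
}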

39
lib/actions/settings.ts Normal file
View File

@ -0,0 +1,39 @@
/*
Copyright 2016-2017 Resin.io
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/
import { CommandDefinition } from 'capitano';
export const list: CommandDefinition = {
signature: 'settings',
description: 'print current settings',
help: `\
Use this command to display detected settings
Examples:
$ resin settings\
`,
async action(_params, _options, done) {
const resin = (await import('resin-sdk')).fromSharedOptions();
const prettyjson = await import('prettyjson');
return resin.settings
.getAll()
.then(prettyjson.render)
.then(console.log)
.nodeify(done);
},
};

141
lib/actions/ssh.coffee Normal file
View File

@ -0,0 +1,141 @@
###
Copyright 2016-2017 Resin.io
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
###
commandOptions = require('./command-options')
{ normalizeUuidProp } = require('../utils/normalization')
module.exports =
signature: 'ssh [uuid]'
description: '(beta) get a shell into the running app container of a device'
help: '''
Warning: 'resin ssh' requires an openssh-compatible client to be correctly
installed in your shell environment. For more information (including Windows
support) please check the README here: https://github.com/resin-io/resin-cli
Use this command to get a shell into the running application container of
your device.
Examples:
$ resin ssh MyApp
$ resin ssh 7cf02a6
$ resin ssh 7cf02a6 --port 8080
$ resin ssh 7cf02a6 -v
$ resin ssh 7cf02a6 -s
'''
permission: 'user'
primary: true
options: [
signature: 'port'
parameter: 'port'
description: 'ssh gateway port'
alias: 'p'
,
signature: 'verbose'
boolean: true
description: 'increase verbosity'
alias: 'v'
commandOptions.hostOSAccess,
signature: 'noproxy'
boolean: true
description: "don't use the proxy configuration for this connection.
Only makes sense if you've configured proxy globally."
]
action: (params, options, done) ->
normalizeUuidProp(params)
child_process = require('child_process')
Promise = require('bluebird')
resin = require('resin-sdk-preconfigured')
_ = require('lodash')
bash = require('bash')
hasbin = require('hasbin')
{ getSubShellCommand } = require('../utils/helpers')
patterns = require('../utils/patterns')
options.port ?= 22
verbose = if options.verbose then '-vvv' else ''
proxyConfig = global.PROXY_CONFIG
useProxy = !!proxyConfig and not options.noproxy
getSshProxyCommand = (hasTunnelBin) ->
return '' if not useProxy
if not hasTunnelBin
console.warn('''
Proxy is enabled but the `proxytunnel` binary cannot be found.
Please install it if you want to route the `resin ssh` requests through the proxy.
Alternatively, you can pass the `--noproxy` option to the `resin ssh` command to ignore the proxy config
for the `ssh` requests.
Attempting the unproxied request for now.
''')
return ''
tunnelOptions =
proxy: "#{proxyConfig.host}:#{proxyConfig.port}"
dest: '%h:%p'
{ proxyAuth } = proxyConfig
if proxyAuth
i = proxyAuth.indexOf(':')
_.assign tunnelOptions,
user: proxyAuth.substring(0, i)
pass: proxyAuth.substring(i + 1)
proxyCommand = "proxytunnel #{bash.args(tunnelOptions, '--', '=')}"
return "-o #{bash.args({ ProxyCommand: proxyCommand }, '', '=')}"
Promise.try ->
return false if not params.uuid
return resin.models.device.has(params.uuid)
.then (uuidExists) ->
return params.uuid if uuidExists
return patterns.inferOrSelectDevice()
.then (uuid) ->
console.info("Connecting to: #{uuid}")
resin.models.device.get(uuid)
.then (device) ->
patterns.exitWithExpectedError('Device is not online') if not device.is_online
Promise.props
username: resin.auth.whoami()
uuid: device.uuid
# get full uuid
containerId: if options.host then '' else resin.models.device.getApplicationInfo(device.uuid).get('containerId')
proxyUrl: resin.settings.get('proxyUrl')
hasTunnelBin: if useProxy then hasbin('proxytunnel') else null
.then ({ username, uuid, containerId, proxyUrl, hasTunnelBin }) ->
throw new Error('Did not find running application container') if not containerId?
Promise.try ->
sshProxyCommand = getSshProxyCommand(hasTunnelBin)
if options.host
accessCommand = "host #{uuid}"
else
accessCommand = "enter #{uuid} #{containerId}"
command = "ssh #{verbose} -t \
-o LogLevel=ERROR \
-o StrictHostKeyChecking=no \
-o UserKnownHostsFile=/dev/null \
#{sshProxyCommand} \
-p #{options.port} #{username}@ssh.#{proxyUrl} #{accessCommand}"
subShellCommand = getSubShellCommand(command)
child_process.spawn subShellCommand.program, subShellCommand.args,
stdio: 'inherit'
.nodeify(done)
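
For clarity, here is a minimal TypeScript sketch of how the `proxytunnel` ProxyCommand option assembled above could be built. It is an illustration only: the CLI constructs the same string via the `bash` argument helper, and the proxy host, port and credentials below are placeholder values.

```
// Illustrative sketch of building the ssh ProxyCommand option (not the CLI's
// implementation); the proxy settings here are placeholders.
interface ProxyConfig {
	host: string;
	port: number;
	proxyAuth?: string; // "user:pass"
}

function getSshProxyCommand(proxy: ProxyConfig): string {
	const args: string[] = [`--proxy=${proxy.host}:${proxy.port}`, '--dest=%h:%p'];
	if (proxy.proxyAuth) {
		const i = proxy.proxyAuth.indexOf(':');
		args.push(`--user=${proxy.proxyAuth.substring(0, i)}`);
		args.push(`--pass=${proxy.proxyAuth.substring(i + 1)}`);
	}
	// proxytunnel opens a TCP tunnel through the HTTP proxy; ssh then reaches
	// the gateway through that tunnel via the ProxyCommand option.
	const proxyCommand = `proxytunnel ${args.join(' ')}`;
	return `-o ProxyCommand="${proxyCommand}"`;
}

console.log(getSshProxyCommand({ host: 'proxy.local', port: 8080, proxyAuth: 'user:secret' }));
// -o ProxyCommand="proxytunnel --proxy=proxy.local:8080 --dest=%h:%p --user=user --pass=secret"
```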

View File

@ -1,5 +1,5 @@
/*
Copyright 2020 Balena
Copyright 2016-2017 Resin.io
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
@ -14,6 +14,5 @@ See the License for the specific language governing permissions and
limitations under the License.
*/
import type { DataOutputOptions, DataSetOutputOptions } from './output';
export { DataOutputOptions, DataSetOutputOptions };
import * as ResinSync from 'resin-sync';
export = ResinSync.capitano('resin-cli');

56
lib/actions/util.coffee Normal file
View File

@ -0,0 +1,56 @@
###
Copyright 2016-2017 Resin.io
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
###
_ = require('lodash')
exports.availableDrives =
# TODO: dedupe with https://github.com/resin-io-modules/resin-cli-visuals/blob/master/lib/widgets/drive/index.coffee
signature: 'util available-drives'
description: 'list available drives'
help: """
Use this command to list the drives on your machine that are usable for writing the OS image to.
System drives are skipped.
"""
action: ->
Promise = require('bluebird')
drivelist = require('drivelist')
driveListAsync = Promise.promisify(drivelist.list)
chalk = require('chalk')
visuals = require('resin-cli-visuals')
formatDrive = (drive) ->
size = drive.size / 1000000000
return {
device: drive.device
size: "#{size.toFixed(1)} GB"
description: drive.description
}
getDrives = ->
driveListAsync().then (drives) ->
return _.reject(drives, system: true)
getDrives()
.then (drives) ->
if not drives.length
console.error("#{chalk.red('x')} No available drives were detected, plug one in!")
return
console.log visuals.table.horizontal drives.map(formatDrive), [
'device'
'size'
'description'
]
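
A minimal TypeScript sketch of the size formatting and system-drive filtering above. The drive objects are hard-coded stand-ins for what the `drivelist` module would return; the field names follow the code above.

```
// Sketch of formatDrive and the system-drive filter, with sample data.
interface Drive {
	device: string;
	size: number; // bytes
	description: string;
	system: boolean;
}

function formatDrive(drive: Drive) {
	return {
		device: drive.device,
		// 1 GB = 10^9 bytes here, matching the decimal units used above
		size: `${(drive.size / 1e9).toFixed(1)} GB`,
		description: drive.description,
	};
}

const drives: Drive[] = [
	{ device: '/dev/disk2', size: 15931539456, description: 'SD card reader', system: false },
	{ device: '/dev/disk0', size: 500277790720, description: 'Internal SSD', system: true },
];

// System drives are skipped so the OS image cannot be written over them
console.log(drives.filter((d) => !d.system).map(formatDrive));
// [ { device: '/dev/disk2', size: '15.9 GB', description: 'SD card reader' } ]
```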

74
lib/actions/wizard.coffee Normal file
View File

@ -0,0 +1,74 @@
###
Copyright 2016-2017 Resin.io
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
###
exports.wizard =
signature: 'quickstart [name]'
description: 'getting started with resin.io'
help: '''
Use this command to run a friendly wizard to get started with resin.io.
The wizard will guide you through:
- Create an application.
- Initialise an SD card with the resin.io operating system.
- Associate an existing project directory with your resin.io application.
- Push your project to your devices.
Examples:
$ resin quickstart
$ resin quickstart MyApp
'''
primary: true
action: (params, options, done) ->
resin = require('resin-sdk').fromSharedOptions()
patterns = require('../utils/patterns')
{ runCommand } = require('../utils/helpers')
resin.auth.isLoggedIn().then (isLoggedIn) ->
return if isLoggedIn
console.info('Looks like you\'re not logged in yet!')
console.info("Let's go through a quick wizard to get you started.\n")
return runCommand('login')
.then ->
return if params.name?
patterns.selectOrCreateApplication().tap (applicationName) ->
resin.models.application.has(applicationName).then (hasApplication) ->
return applicationName if hasApplication
runCommand("app create #{applicationName}")
.then (applicationName) ->
params.name = applicationName
.then ->
return runCommand("device init --application #{params.name}")
.tap(patterns.awaitDevice)
.then (uuid) ->
return runCommand("device #{uuid}")
.then ->
return resin.models.application.get(params.name)
.then (application) ->
console.log """
Your device is ready to start pushing some code!
Check our official documentation for more information:
http://docs.resin.io/#/pages/introduction/introduction.md
Clone an example or go to an existing application directory and run:
$ git remote add resin #{application.git_repository}
$ git push resin master
"""
.nodeify(done)

234
lib/app.coffee Normal file
View File

@ -0,0 +1,234 @@
###
Copyright 2016-2017 Resin.io
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
###
# Assign bluebird as the global promise library
# stream-to-promise will produce native promises if not
# for this module, which could wreak havoc in this
# bluebird-only codebase.
require('any-promise/register/bluebird')
Raven = require('raven')
Raven.disableConsoleAlerts()
Raven.config require('./config').sentryDsn,
captureUnhandledRejections: true,
autoBreadcrumbs: true,
release: require('../package.json').version
.install (logged, error) ->
console.error(error)
process.exit(1)
Raven.setContext
extra:
args: process.argv
node_version: process.version
validNodeVersions = require('../package.json').engines.node
if not require('semver').satisfies(process.version, validNodeVersions)
console.warn """
Warning: this version of Node does not match the requirements of this package.
This package expects #{validNodeVersions}, but you're using #{process.version}.
This may cause unexpected behaviour.
To upgrade your Node, visit https://nodejs.org/en/download/
"""
# Do this before requiring any other modules,
# including 'resin-sdk', to prevent any module from reading the http proxy config
# before we do
globalTunnel = require('global-tunnel-ng')
settings = require('resin-settings-client')
try
proxy = settings.get('proxy') or null
catch
proxy = null
# Init the tunnel even if the proxy is not configured,
# because it can also get the proxy from the http(s)_proxy env vars.
# If those are not set either, the initialize call does nothing
globalTunnel.initialize(proxy)
# TODO: make this a feature of capitano https://github.com/resin-io/capitano/issues/48
global.PROXY_CONFIG = globalTunnel.proxyConfig
Promise = require('bluebird')
capitano = require('capitano')
capitanoExecuteAsync = Promise.promisify(capitano.execute)
# We don't yet use resin-sdk directly everywhere, but we set up shared
# options correctly so we can do so safely in submodules
ResinSdk = require('resin-sdk')
ResinSdk.setSharedOptions(
apiUrl: settings.get('apiUrl')
imageMakerUrl: settings.get('imageMakerUrl')
dataDirectory: settings.get('dataDirectory')
retries: 2
)
resin = ResinSdk.fromSharedOptions()
actions = require('./actions')
errors = require('./errors')
events = require('./events')
update = require('./utils/update')
{ exitWithExpectedError } = require('./utils/patterns')
capitano.permission 'user', (done) ->
resin.auth.isLoggedIn().then (isLoggedIn) ->
if not isLoggedIn
exitWithExpectedError('''
You have to log in to continue
Run the following command to go through the login wizard:
$ resin login
''')
.nodeify(done)
capitano.command
signature: '*'
action: ->
capitano.execute(command: 'help')
capitano.globalOption
signature: 'help'
boolean: true
alias: 'h'
# ---------- Info Module ----------
capitano.command(actions.info.version)
# ---------- Help Module ----------
capitano.command(actions.help.help)
# ---------- Wizard Module ----------
capitano.command(actions.wizard.wizard)
# ---------- Api key module ----------
capitano.command(actions.apiKey.generate)
# ---------- App Module ----------
capitano.command(actions.app.create)
capitano.command(actions.app.list)
capitano.command(actions.app.remove)
capitano.command(actions.app.restart)
capitano.command(actions.app.info)
# ---------- Auth Module ----------
capitano.command(actions.auth.login)
capitano.command(actions.auth.logout)
capitano.command(actions.auth.signup)
capitano.command(actions.auth.whoami)
# ---------- Device Module ----------
capitano.command(actions.device.list)
capitano.command(actions.device.supported)
capitano.command(actions.device.rename)
capitano.command(actions.device.init)
capitano.command(actions.device.remove)
capitano.command(actions.device.identify)
capitano.command(actions.device.reboot)
capitano.command(actions.device.shutdown)
capitano.command(actions.device.enableDeviceUrl)
capitano.command(actions.device.disableDeviceUrl)
capitano.command(actions.device.getDeviceUrl)
capitano.command(actions.device.hasDeviceUrl)
capitano.command(actions.device.register)
capitano.command(actions.device.move)
capitano.command(actions.device.info)
# ---------- Notes Module ----------
capitano.command(actions.notes.set)
# ---------- Keys Module ----------
capitano.command(actions.keys.list)
capitano.command(actions.keys.add)
capitano.command(actions.keys.info)
capitano.command(actions.keys.remove)
# ---------- Env Module ----------
capitano.command(actions.env.list)
capitano.command(actions.env.add)
capitano.command(actions.env.rename)
capitano.command(actions.env.remove)
# ---------- OS Module ----------
capitano.command(actions.os.versions)
capitano.command(actions.os.download)
capitano.command(actions.os.buildConfig)
capitano.command(actions.os.configure)
capitano.command(actions.os.initialize)
# ---------- Config Module ----------
capitano.command(actions.config.read)
capitano.command(actions.config.write)
capitano.command(actions.config.inject)
capitano.command(actions.config.reconfigure)
capitano.command(actions.config.generate)
# ---------- Settings Module ----------
capitano.command(actions.settings.list)
# ---------- Logs Module ----------
capitano.command(actions.logs)
# ---------- Sync Module ----------
capitano.command(actions.sync)
# ---------- Preload Module ----------
capitano.command(actions.preload)
# ---------- SSH Module ----------
capitano.command(actions.ssh)
# ---------- Local ResinOS Module ----------
capitano.command(actions.local.configure)
capitano.command(actions.local.flash)
capitano.command(actions.local.logs)
capitano.command(actions.local.push)
capitano.command(actions.local.ssh)
capitano.command(actions.local.scan)
capitano.command(actions.local.stop)
# ---------- Public utils ----------
capitano.command(actions.util.availableDrives)
# ---------- Internal utils ----------
capitano.command(actions.internal.osInit)
capitano.command(actions.internal.scanDevices)
capitano.command(actions.internal.sudo)
#------------ Local build and deploy -------
capitano.command(actions.build)
capitano.command(actions.deploy)
#------------ Push/remote builds -------
capitano.command(actions.push.push)
#------------ Join/Leave -------
capitano.command(actions.join.join)
capitano.command(actions.leave.leave)
update.notify()
cli = capitano.parse(process.argv)
runCommand = ->
if cli.global?.help
capitanoExecuteAsync(command: "help #{cli.command ? ''}")
else
capitanoExecuteAsync(cli)
Promise.all([events.trackCommand(cli), runCommand()])
.catch(errors.handle)

View File

@ -1,188 +0,0 @@
/**
* @license
* Copyright 2019-2020 Balena Ltd.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
import * as packageJSON from '../package.json';
import {
AppOptions,
checkDeletedCommand,
preparseArgs,
unsupportedFlag,
} from './preparser';
import { CliSettings } from './utils/bootstrap';
import { onceAsync } from './utils/lazy';
/**
* Sentry.io setup
* @see https://docs.sentry.io/error-reporting/quickstart/?platform=node
*/
export const setupSentry = onceAsync(async () => {
const config = await import('./config');
const Sentry = await import('@sentry/node');
Sentry.init({
autoSessionTracking: false,
dsn: config.sentryDsn,
release: packageJSON.version,
});
Sentry.configureScope((scope) => {
scope.setExtras({
is_pkg: !!(process as any).pkg,
node_version: process.version,
platform: process.platform,
});
});
return Sentry.getCurrentHub();
});
async function checkNodeVersion() {
const validNodeVersions = packageJSON.engines.node;
if (!(await import('semver')).satisfies(process.version, validNodeVersions)) {
const { getNodeEngineVersionWarn } = await import('./utils/messages');
console.warn(getNodeEngineVersionWarn(process.version, validNodeVersions));
}
}
/** Setup balena-sdk options that are shared with imported packages */
function setupBalenaSdkSharedOptions(settings: CliSettings) {
const BalenaSdk = require('balena-sdk') as typeof import('balena-sdk');
BalenaSdk.setSharedOptions({
apiUrl: settings.get<string>('apiUrl'),
dataDirectory: settings.get<string>('dataDirectory'),
});
}
/**
* Addresses the console warning:
* (node:49500) MaxListenersExceededWarning: Possible EventEmitter memory
* leak detected. 11 error listeners added. Use emitter.setMaxListeners() to
* increase limit
*/
export function setMaxListeners(maxListeners: number) {
require('events').EventEmitter.defaultMaxListeners = maxListeners;
}
/** Selected CLI initialization steps */
async function init() {
if (process.env.BALENARC_NO_SENTRY) {
if (process.env.DEBUG) {
console.error(`WARN: disabling Sentry.io error reporting`);
}
} else {
await setupSentry();
}
await checkNodeVersion();
const settings = new CliSettings();
// Proxy setup should be done early on, before loading balena-sdk
await (await import('./utils/proxy')).setupGlobalHttpProxy(settings);
setupBalenaSdkSharedOptions(settings);
// check for CLI updates once a day
if (!process.env.BALENARC_OFFLINE_MODE) {
(await import('./utils/update')).notify();
}
}
/** Execute the oclif parser and the CLI command. */
async function oclifRun(command: string[], options: AppOptions) {
let deprecationPromise: Promise<void>;
// check and enforce the CLI's deprecation policy
if (unsupportedFlag || process.env.BALENARC_UNSUPPORTED) {
deprecationPromise = Promise.resolve();
} else {
const { DeprecationChecker } = await import('./deprecation');
const deprecationChecker = new DeprecationChecker(packageJSON.version);
// warnAndAbortIfDeprecated uses previously cached data only
await deprecationChecker.warnAndAbortIfDeprecated();
// checkForNewReleasesIfNeeded may query the npm registry
deprecationPromise = deprecationChecker.checkForNewReleasesIfNeeded();
}
const runPromise = (async function (shouldFlush: boolean) {
const { CustomMain } = await import('./utils/oclif-utils');
let isEEXIT = false;
try {
await CustomMain.run(command);
} catch (error) {
// oclif sometimes exits with ExitError code EEXIT 0 (not an error),
// for example the `balena help` command.
// (Avoid `error instanceof ExitError` here for the reasons explained
// in the CONTRIBUTING.md file regarding the `instanceof` operator.)
if (error.oclif?.exit === 0) {
isEEXIT = true;
} else {
throw error;
}
}
if (shouldFlush) {
await import('@oclif/command/flush');
}
// TODO: figure out why we need to call fast-boot stop() here, in
// addition to calling it in the main `run()` function in this file.
// If it is not called here as well, there is a process exit delay of
// 1 second when the fast-boot2 cache is modified (1 second is the
// default cache saving timeout). Try for example `balena help`.
// I have found that, when oclif's `Error: EEXIT: 0` is caught in
// the try/catch block above, execution does not get past the
// Promise.all() call below, but I don't understand why.
if (isEEXIT) {
(await import('./fast-boot')).stop();
}
})(!options.noFlush);
const { trackPromise } = await import('./hooks/prerun/track');
await Promise.all([trackPromise, deprecationPromise, runPromise]);
}
/** CLI entrypoint. Called by the `bin/balena` and `bin/balena-dev` scripts. */
export async function run(cliArgs = process.argv, options: AppOptions = {}) {
try {
const { setOfflineModeEnvVars, normalizeEnvVars, pkgExec } = await import(
'./utils/bootstrap'
);
setOfflineModeEnvVars();
normalizeEnvVars();
// The 'pkgExec' special/internal command provides a Node.js interpreter
// for use of the standalone zip package. See pkgExec function.
if (cliArgs.length > 3 && cliArgs[2] === 'pkgExec') {
return pkgExec(cliArgs[3], cliArgs.slice(4));
}
await init();
// Look for commands that have been removed and if so, exit with a notice
checkDeletedCommand(cliArgs.slice(2));
const args = await preparseArgs(cliArgs);
await oclifRun(args, options);
} catch (err) {
await (await import('./errors')).handleError(err);
} finally {
try {
(await import('./fast-boot')).stop();
} catch (e) {
if (process.env.DEBUG) {
console.error(`[debug] Stopping fast-boot: ${e}`);
}
}
// Windows fix: reading from stdin prevents the process from exiting
process.stdin.pause();
}
}
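
The EEXIT-0 handling in `oclifRun` above can be illustrated with a small, self-contained sketch. The error shape mirrors oclif's `ExitError` (`error.oclif.exit`), but the check is deliberately duck-typed, as the comment in the code explains; the fake command below is invented for the example.

```
// Duck-typed check for an oclif "exit with code 0" error (not a failure).
function isExitZero(error: unknown): boolean {
	const exit = (error as { oclif?: { exit?: number } })?.oclif?.exit;
	return exit === 0;
}

async function runCommand(run: () => Promise<void>): Promise<void> {
	try {
		await run();
	} catch (error) {
		if (isExitZero(error)) {
			// e.g. `balena help` exits via ExitError with code 0: not a failure
			return;
		}
		throw error;
	}
}

// Usage sketch with a fake command that "exits" with code 0:
runCommand(async () => {
	throw Object.assign(new Error('EEXIT: 0'), { oclif: { exit: 0 } });
}).then(() => console.log('treated as success'));
```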

63
lib/auth/index.coffee Normal file
View File

@ -0,0 +1,63 @@
###
Copyright 2016 Resin.io
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
###
###*
# @module auth
###
open = require('opn')
resin = require('resin-sdk').fromSharedOptions()
server = require('./server')
utils = require('./utils')
###*
# @summary Login to the Resin CLI using the web dashboard
# @function
# @public
#
# @description
# This function opens the user's default browser and points it
# to the Resin.io dashboard where the session token exchange will
# take place.
#
# Once the token is retrieved, it's automatically persisted.
#
# @fulfil {String} - session token
# @returns {Promise}
#
# @example
# auth.login().then (sessionToken) ->
# console.log('I\'m logged in!')
# console.log("My session token is: #{sessionToken}")
###
exports.login = ->
options =
port: 8989
path: '/auth'
# Needs to be 127.0.0.1, not localhost, because only the IP is whitelisted
# from mixed content warnings (as the target of a form in the result page)
callbackUrl = "http://127.0.0.1:#{options.port}#{options.path}"
return utils.getDashboardLoginURL(callbackUrl).then (loginUrl) ->
# Leave a bit of time for the
# server to get up and running
setTimeout ->
open(loginUrl, { wait: false })
, 1000
return server.awaitForToken(options)
.tap(resin.auth.loginWithToken)

View File

@ -1,66 +0,0 @@
/*
Copyright 2016-2020 Balena
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/
import { getBalenaSdk } from '../utils/lazy';
import { LoginServer } from './server';
/**
* @module auth
*/
/**
* @summary Login to the balena CLI using the web dashboard
* @function
* @public
*
* @description
* This function opens the user's default browser and points it
* to the balena dashboard where the session token exchange will
* take place.
*
* Once the token is retrieved, it's automatically persisted.
*
* @fulfil {String} - session token
* @returns {Promise}
*
* @example
* auth.login().then (sessionToken) ->
* console.log('I\'m logged in!')
* console.log("My session token is: #{sessionToken}")
*/
export async function login({ host = '127.0.0.1', port = 0 }) {
const utils = await import('./utils');
const loginServer = new LoginServer();
const {
host: actualHost,
port: actualPort,
urlPath,
} = await loginServer.start({ host, port });
const callbackUrl = `http://${actualHost}:${actualPort}${urlPath}`;
const loginUrl = await utils.getDashboardLoginURL(callbackUrl);
console.info(`Opening web browser for URL:\n${loginUrl}`);
const open = await import('open');
await open(loginUrl, { wait: false });
const balena = getBalenaSdk();
const token = await loginServer.awaitForToken();
await balena.auth.loginWithToken(token);
loginServer.shutdown();
return token;
}

View File

@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta http-equiv="x-ua-compatible" content="ie=edge">
<title>balena CLI - Error</title>
<title>Resin CLI - Error</title>
<meta name="description" content="">
<meta name="viewport" content="width=device-width, initial-scale=1">
<link rel="stylesheet" type="text/css" href="./static/style.css" inline>
@ -12,11 +12,10 @@
<div class="center">
<img class="icon" src="./static/images/sad.png" inline>
<h1>Something went wrong</h1>
<br>
<p>The balena CLI login was not successful.</p>
<p>You couldn't login to the Resin CLI for some reason</p>
<br>
<br>
<a href="https://forums.balena.io/" class="button danger">Get help in our forums</a>
<a href="https://forums.resin.io/" class="button danger">Get help in our forums</a>
</div>
</body>
</html>

View File

@ -3,7 +3,7 @@
<head>
<meta charset="utf-8">
<meta http-equiv="x-ua-compatible" content="ie=edge">
<title>balena CLI - Success</title>
<title>Resin CLI - Success</title>
<meta name="description" content="">
<meta name="viewport" content="width=device-width, initial-scale=1">
<link rel="stylesheet" type="text/css" href="./static/style.css" inline>
@ -12,9 +12,10 @@
<div class="center">
<img class="icon" src="./static/images/happy.png" inline>
<h1>Success!</h1>
<p>You successfully logged in the Resin CLI</p>
<br>
<p>The balena CLI login was successful.</p>
<p>You may now close this page and return to the command prompt.</p>
<br>
<a href="<%= dashboardUrl %>" class="button normal">Go to the dashboard</a>
</div>
</body>
</html>

101
lib/auth/server.coffee Normal file
View File

@ -0,0 +1,101 @@
###
Copyright 2016 Resin.io
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
###
express = require('express')
path = require('path')
bodyParser = require('body-parser')
Promise = require('bluebird')
resin = require('resin-sdk-preconfigured')
utils = require('./utils')
createServer = ({ port, isDev } = {}) ->
app = express()
app.use bodyParser.urlencoded
extended: true
app.set('view engine', 'ejs')
app.set('views', path.join(__dirname, 'pages'))
if isDev
app.use(express.static(path.join(__dirname, 'pages', 'static')))
server = app.listen(port)
return { app, server }
###*
# @summary Await for token
# @function
# @protected
#
# @param {Object} options - options
# @param {String} options.path - callback path
# @param {Number} options.port - http port
#
# @example
# server.awaitForToken
# path: '/auth'
# port: 9001
# .then (token) ->
# console.log(token)
###
exports.awaitForToken = (options) ->
{ app, server } = createServer(port: options.port)
return new Promise (resolve, reject) ->
closeServer = (errorMessage, successPayload) ->
server.close ->
if errorMessage
reject(new Error(errorMessage))
return
resolve(successPayload)
renderAndDone = ({ request, response, viewName, errorMessage, statusCode, token }) ->
return getContext(viewName)
.then (context) ->
response.status(statusCode || 200).render(viewName, context)
request.connection.destroy()
closeServer(errorMessage, token)
app.post options.path, (request, response) ->
token = request.body.token?.trim()
Promise.try ->
if not token
throw new Error('No token')
return utils.loginIfTokenValid(token)
.tap (loggedIn) ->
if not loggedIn
throw new Error('Invalid token')
.then ->
renderAndDone({ request, response, viewName: 'success', token })
.catch (error) ->
renderAndDone({
request, response, viewName: 'error',
statusCode: 401, errorMessage: error.message
})
app.use (request, response) ->
response.status(404).send('Not found')
closeServer('Unknown path or verb')
exports.getContext = getContext = (viewName) ->
if viewName is 'success'
return Promise.props
dashboardUrl: resin.settings.get('dashboardUrl')
return Promise.resolve({})

View File

@ -1,149 +0,0 @@
/*
Copyright 2016-2020 Balena
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/
import * as bodyParser from 'body-parser';
import { EventEmitter } from 'events';
import * as express from 'express';
import type { Socket } from 'net';
import * as path from 'path';
import * as utils from './utils';
import { ExpectedError } from '../errors';
export class LoginServer extends EventEmitter {
protected expressApp: express.Express;
protected server: import('net').Server;
protected serverSockets: Socket[] = [];
protected firstError: Error;
protected token: string;
public readonly loginPath = '/auth';
/**
* Start the HTTP server, listening on the given IP address and port number.
* If the port number is 0, the OS will allocate a free port number.
*/
public async start({ host = '127.0.0.1', port = 0 } = {}): Promise<{
host: string;
port: number;
urlPath: string;
}> {
this.once('error', (err: Error) => {
this.firstError = err;
});
this.on('token', (token: string) => {
this.token = token;
});
const app = (this.expressApp = express());
app.use(
bodyParser.urlencoded({
extended: true,
}),
);
app.set('view engine', 'ejs');
app.set('views', path.join(__dirname, 'pages'));
this.server = await new Promise<import('net').Server>((resolve, reject) => {
const callback = (err: Error) => {
if (err) {
this.emit('error', err);
reject(err);
} else {
resolve(server);
}
};
const server = app.listen(port, host, callback as any);
server.on('connection', (socket) => this.serverSockets.push(socket));
});
this.expressApp.post(this.loginPath, async (request, response) => {
this.server.close(); // stop listening for new connections
try {
const token = request.body.token?.trim();
if (!token) {
throw new ExpectedError('No token');
}
const loggedIn = await utils.loginIfTokenValid(token);
if (!loggedIn) {
throw new ExpectedError('Invalid token');
}
this.emit('token', token);
response.status(200).render('success');
} catch (error) {
this.emit('error', error);
response.status(401).render('error');
}
});
this.expressApp.use((_request, response) => {
this.server.close(); // stop listening for new connections
this.emit('error', new Error('Unknown path or verb'));
response.status(404).send('Not found');
});
return this.getAddress();
}
public getAddress(): { host: string; port: number; urlPath: string } {
const info = this.server.address() as import('net').AddressInfo;
return {
host: info.address,
port: info.port,
urlPath: this.loginPath,
};
}
/**
* Shut the server down.
* Call this method to avoid the process hanging in some situations.
*/
public shutdown() {
// A Node.js `http.server` instance prevents the process from exiting for up to
// 2 minutes (by default) if a client keeps an HTTP connection open, and regardless
// of whether `server.close()` was called: the `server.close(callback)` callback
// takes just as long to complete. Setting `server.timeout` to some value like
// 3 seconds works, but then the CLI process hangs for "only" 3 seconds. Reducing
// the timeout to 1 second may cause authentication failure if the laptop or CI
// server are slow for any reason. The only reliable way around it seems to be to
// explicitly unref the sockets, so the event loop stops waiting for it. See:
// https://github.com/nodejs/node/issues/2642
// https://github.com/nodejs/node-v0.x-archive/issues/9066
//
this.serverSockets.forEach((s) => s.unref());
this.serverSockets.splice(0);
}
/**
* Await for the user to complete login through a web browser.
* Resolve to the authentication token string.
*
* @return Promise that resolves to the authentication token string
*/
public async awaitForToken(): Promise<string> {
if (this.firstError) {
throw this.firstError;
}
if (this.token) {
return this.token;
}
return new Promise<string>((resolve, reject) => {
this.on('error', reject);
this.on('token', resolve);
});
}
}
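
The socket bookkeeping in `shutdown()` above is what lets the CLI process exit promptly. Here is a minimal, standalone Node.js sketch of the same pattern, with a placeholder request handler instead of the real login route.

```
// Track sockets as they connect and unref() them so the event loop is not
// kept alive by idle keep-alive connections after server.close().
import * as http from 'http';
import type { Socket } from 'net';

const sockets: Socket[] = [];

const server = http.createServer((_req, res) => {
	res.end('ok');
});

server.on('connection', (socket) => sockets.push(socket));

server.listen(0, '127.0.0.1', () => {
	const { port } = server.address() as import('net').AddressInfo;
	console.log(`listening on 127.0.0.1:${port}`);
	// ... handle the login callback, then:
	server.close(); // stop accepting new connections
	sockets.forEach((s) => s.unref()); // let the process exit without waiting
	sockets.splice(0);
});
```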

80
lib/auth/utils.coffee Normal file
View File

@ -0,0 +1,80 @@
###
Copyright 2016 Resin.io
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
###
resin = require('resin-sdk').fromSharedOptions()
_ = require('lodash')
url = require('url')
Promise = require('bluebird')
###*
# @summary Get dashboard CLI login URL
# @function
# @protected
#
# @param {String} callbackUrl - callback url
# @fulfil {String} - dashboard login url
# @returns {Promise}
#
# @example
# utils.getDashboardLoginURL('http://127.0.0.1:3000').then (url) ->
# console.log(url)
###
exports.getDashboardLoginURL = (callbackUrl) ->
# Encode percentage signs from the escaped url
# characters to avoid angular getting confused.
callbackUrl = encodeURIComponent(callbackUrl).replace(/%/g, '%25')
resin.settings.get('dashboardUrl').then (dashboardUrl) ->
return url.resolve(dashboardUrl, "/login/cli/#{callbackUrl}")
###*
# @summary Log in using a token, but only if the token is valid
# @function
# @protected
#
# @description
# This function checks that the token is not only well-structured
# but that it also authenticates with the server successfully.
#
# If authenticated, the token is persisted, if not then the previous
# login state is restored.
#
# @param {String} token - session token or api key
# @fulfil {Boolean} - whether the login was successful or not
# @returns {Promise}
#
# utils.loginIfTokenValid('...').then (loggedIn) ->
# if loggedIn
# console.log('Token is valid!')
###
exports.loginIfTokenValid = (token) ->
if not token? or _.isEmpty(token.trim())
return Promise.resolve(false)
return resin.auth.getToken()
.catchReturn(undefined)
.then (currentToken) ->
resin.auth.loginWithToken(token)
.return(token)
.then(resin.auth.isLoggedIn)
.tap (isLoggedIn) ->
return if isLoggedIn
if currentToken?
return resin.auth.loginWithToken(currentToken)
else
return resin.auth.logout()

View File

@ -1,76 +0,0 @@
/*
Copyright 2016 Balena
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
*/
import { getBalenaSdk } from '../utils/lazy';
/**
* Get dashboard CLI login URL
*
* @param callbackUrl - Callback url, e.g. 'http://127.0.0.1:3000'
* @returns Dashboard login URL, e.g.:
* 'https://dashboard.balena-cloud.com/login/cli/http%253A%252F%252F127.0.0.1%253A59581%252Fauth'
*/
export async function getDashboardLoginURL(
callbackUrl: string,
): Promise<string> {
// Encode percentage signs from the escaped url
// characters to avoid angular getting confused.
callbackUrl = encodeURIComponent(callbackUrl).replace(/%/g, '%25');
const [{ URL }, dashboardUrl] = await Promise.all([
import('url'),
getBalenaSdk().settings.get('dashboardUrl'),
]);
return new URL(`/login/cli/${callbackUrl}`, dashboardUrl).href;
}
/**
* Log in using a token, but only if the token is valid.
*
* This function checks that the token is not only well-structured
* but that it also authenticates with the server successfully.
*
* If authenticated, the token is persisted, if not then the previous
* login state is restored.
*
* @param token - session token or api key
* @returns whether the login was successful or not
*/
export async function loginIfTokenValid(token?: string): Promise<boolean> {
token = (token || '').trim();
if (!token) {
return false;
}
const balena = getBalenaSdk();
let currentToken;
try {
currentToken = await balena.auth.getToken();
} catch {
// ignore
}
await balena.auth.loginWithToken(token);
const isLoggedIn = await balena.auth.isLoggedIn();
if (!isLoggedIn) {
if (currentToken != null) {
await balena.auth.loginWithToken(currentToken);
} else {
await balena.auth.logout();
}
}
return isLoggedIn;
}
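
A worked example of the double-encoding performed by `getDashboardLoginURL` above. The callback URL and dashboard URL match the values used in the doc comment; the intermediate strings are shown in comments.

```
const callbackUrl = 'http://127.0.0.1:59581/auth';

// First pass: standard URL-escaping of the callback URL
const escaped = encodeURIComponent(callbackUrl);
// 'http%3A%2F%2F127.0.0.1%3A59581%2Fauth'

// Second pass: escape the '%' signs themselves so Angular (the dashboard)
// does not get confused by the already-escaped characters
const doubleEscaped = escaped.replace(/%/g, '%25');
// 'http%253A%252F%252F127.0.0.1%253A59581%252Fauth'

const loginUrl = new URL(`/login/cli/${doubleEscaped}`, 'https://dashboard.balena-cloud.com').href;
console.log(loginUrl);
// https://dashboard.balena-cloud.com/login/cli/http%253A%252F%252F127.0.0.1%253A59581%252Fauth
```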

View File

@ -1,174 +0,0 @@
/**
* @license
* Copyright 2020 Balena Ltd.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
import Command from '@oclif/command';
import {
InsufficientPrivilegesError,
NotAvailableInOfflineModeError,
} from './errors';
import { stripIndent } from './utils/lazy';
import * as output from './framework/output';
export default abstract class BalenaCommand extends Command {
/**
* When set to true, command will be listed in `help`,
* otherwise listed in `help --verbose` with secondary commands.
*/
public static primary = false;
/**
* Require elevated privileges to run.
* When set to true, command will exit with an error
* if executed without root on Mac/Linux
* or if executed by non-Administrator on Windows.
*/
public static root = false;
/**
* Require authentication to run.
* When set to true, command will exit with an error
* if user is not already logged in.
*/
public static authenticated = false;
/**
* Require an internet connection to run.
* When set to true, command will exit with an error
* if user is running in offline mode (BALENARC_OFFLINE_MODE).
*/
public static offlineCompatible = false;
/**
* Accept piped input.
* When set to true, command will read from stdin during init
* and make contents available on member `stdin`.
*/
public static readStdin = false;
public stdin: string;
/**
* Throw InsufficientPrivilegesError if not root on Mac/Linux
* or non-Administrator on Windows.
*
* Called automatically if `root=true`.
* Can be called explicitly by command implementation, if e.g.:
* - check should only be done conditionally
* - other code needs to execute before check
*/
protected static async checkElevatedPrivileges() {
const isElevated = await (await import('is-elevated'))();
if (!isElevated) {
throw new InsufficientPrivilegesError(
'You need root/admin privileges to run this command',
);
}
}
/**
* Throw NotLoggedInError if not logged in.
*
* Called automatically if `authenticated=true`.
* Can be called explicitly by command implementation, if e.g.:
* - check should only be done conditionally
* - other code needs to execute before check
*
* Note, currently public to allow use outside of derived commands
* (as some command implementations require this; it can be made protected
* if this changes).
*
* @throws {NotLoggedInError}
*/
public static async checkLoggedIn() {
await (await import('./utils/patterns')).checkLoggedIn();
}
/**
* Throw NotLoggedInError if not logged in when condition true.
*
* @param {boolean} doCheck - will check if true.
* @throws {NotLoggedInError}
*/
public static async checkLoggedInIf(doCheck: boolean) {
if (doCheck) {
await this.checkLoggedIn();
}
}
/**
* Throw NotAvailableInOfflineModeError if in offline mode.
*
* Called automatically unless `offlineCompatible=true`.
* Can be called explicitly by command implementation, if e.g.:
* - check should only be done conditionally
* - other code needs to execute before check
*
* Note, currently public to allow use outside of derived commands
* (as some command implementations require this; it can be made protected
* if this changes).
*
* @throws {NotAvailableInOfflineModeError}
*/
public static checkNotUsingOfflineMode() {
if (process.env.BALENARC_OFFLINE_MODE) {
throw new NotAvailableInOfflineModeError(stripIndent`
This command requires an internet connection, and cannot be used in offline mode.
To leave offline mode, unset the BALENARC_OFFLINE_MODE environment variable.
`);
}
}
/**
* Read stdin contents and make available to command.
*
* This approach could be improved in the future to automatically set argument
* values from stdin based on configuration, minimising command implementation.
*/
protected async getStdin() {
this.stdin = await (await import('get-stdin'))();
}
/**
* Get a logger instance.
*/
protected static async getLogger() {
return (await import('./utils/logger')).getLogger();
}
protected async init() {
const ctr = this.constructor as typeof BalenaCommand;
if (ctr.root) {
await BalenaCommand.checkElevatedPrivileges();
}
if (ctr.authenticated) {
await BalenaCommand.checkLoggedIn();
}
if (!ctr.offlineCompatible) {
BalenaCommand.checkNotUsingOfflineMode();
}
if (ctr.readStdin) {
await this.getStdin();
}
}
protected outputMessage = output.outputMessage;
protected outputData = output.outputData;
}
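
A hypothetical derived command, sketching how the static toggles handled by `BalenaCommand.init()` above are meant to be used. The command itself (`EchoNoteCmd`) is invented for this example; only `authenticated` and `readStdin` are real properties of the base class, and the import paths assume the file lives next to the other commands.

```
import Command from '../../command';
import { stripIndent } from '../../utils/lazy';

export default class EchoNoteCmd extends Command {
	public static description = stripIndent`
		Read a note from stdin and echo it back.
	`;

	public static usage = 'echo-note';

	// init() will call checkLoggedIn() before run() executes
	public static authenticated = true;

	// init() will read stdin and expose it as this.stdin
	public static readStdin = true;

	public async run() {
		console.log(`note: ${this.stdin.trim()}`);
	}
}
```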

View File

@ -1,85 +0,0 @@
/**
* @license
* Copyright 2016-2020 Balena Ltd.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/
import { flags } from '@oclif/command';
import Command from '../../command';
import { ExpectedError } from '../../errors';
import * as cf from '../../utils/common-flags';
import { getBalenaSdk, stripIndent } from '../../utils/lazy';
interface FlagsDef {
help: void;
}
interface ArgsDef {
name: string;
}
export default class GenerateCmd extends Command {
public static description = stripIndent`
Generate a new balenaCloud API key.
Generate a new balenaCloud API key for the current user, with the given
name. The key will be logged to the console.
This key can be used to log into the CLI using 'balena login --token <key>',
or to authenticate requests to the API with an 'Authorization: Bearer <key>' header.
`;
public static examples = ['$ balena api-key generate "Jenkins Key"'];
public static args = [
{
name: 'name',
description: 'the API key name',
required: true,
},
];
public static usage = 'api-key generate <name>';
public static flags: flags.Input<FlagsDef> = {
help: cf.help,
};
public static authenticated = true;
public async run() {
const { args: params } = this.parse<FlagsDef, ArgsDef>(GenerateCmd);
let key;
try {
key = await getBalenaSdk().models.apiKey.create(params.name);
} catch (e) {
if (e.name === 'BalenaNotLoggedIn') {
throw new ExpectedError(stripIndent`
This command cannot be run when logged in with an API key.
Please login again with 'balena login' and select an alternative method.
`);
} else {
throw e;
}
}
console.log(stripIndent`
Registered api key '${params.name}':
${key}
This key will not be shown again, so please save it now.
`);
}
}
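
To illustrate the 'Authorization: Bearer' usage mentioned in the description above, here is a small sketch of calling the balena API with a generated key. The endpoint and the use of the global `fetch` (Node 18+) are assumptions for the example, not part of the CLI.

```
// Sketch: authenticate an API request with a key from `balena api-key generate`.
const apiKey = process.env.BALENA_API_KEY ?? '<key from `balena api-key generate`>';

async function whoami(): Promise<void> {
	const response = await fetch('https://api.balena-cloud.com/user/v1/whoami', {
		headers: { Authorization: `Bearer ${apiKey}` },
	});
	if (!response.ok) {
		throw new Error(`API request failed: ${response.status}`);
	}
	console.log(await response.json());
}

whoami().catch((err) => console.error(err));
```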

Some files were not shown because too many files have changed in this diff.