---
description: >-
  Describes how users can analyze their usage and billing data.
---

# Usage

## Download Usage Data

Usage data can be downloaded:

- from the Usage page: Organization Settings -> Other -> Usage
- using the [spacectl](https://github.com/spacelift-io/spacectl) command: `spacectl profile usage-csv`
- via the Spacelift API at `https://<your-spacelift-host>/usageanalytics/csv` (note: this is not a GraphQL endpoint), as shown in the sketch below
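
For scripted downloads you can call the endpoint directly. A minimal sketch, assuming your API token is exported as `SPACELIFT_API_TOKEN` and that the endpoint accepts standard bearer authentication (verify both against your Spacelift setup):

```shell
# Hedged sketch: fetch the usage CSV straight from the API endpoint.
# SPACELIFT_API_TOKEN is assumed to hold a valid Spacelift API token.
curl -fsSL \
  -H "Authorization: Bearer $SPACELIFT_API_TOKEN" \
  "https://<your-spacelift-host>/usageanalytics/csv" \
  -o usage.csv
```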

## Analyzing Usage Data

### Load CSV data into PostgreSQL

Create a table for the worker count data and load the CSV file into it:

```sql
create table worker_count (
    id int,
    count int,
    timestamp_unix int,
    worker_pool_name varchar(255)
);
```

```shell
psql -h <host> -U <user> -d <database> -c "\copy worker_count from '<path-to-csv-file>' delimiter ',' csv header"
```
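
Once loaded, plain SQL answers sizing questions. A minimal sketch, grounded in the `worker_count` table above, showing peak and average worker counts per pool:

```sql
-- Peak and average worker count per pool, with the observed time range.
select
    wc.worker_pool_name,
    max(wc.count)                        as peak_workers,
    round(avg(wc.count), 2)              as avg_workers,
    to_timestamp(min(wc.timestamp_unix)) as first_sample,
    to_timestamp(max(wc.timestamp_unix)) as last_sample
from worker_count wc
group by wc.worker_pool_name
order by peak_workers desc;
```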

And the same for run minutes:

```sql
create table run_minutes (
    timestamp_unix bigint,
    state_duration_minutes float,
    public boolean,
    run_state varchar(255),
    run_type varchar(255),
    run_ulid varchar(26),
    stack_name varchar(255),
    stack_slug varchar(255),
    stack_ulid varchar(26),
    is_stack boolean,
    worker_pool_name varchar(255),
    worker_pool_ulid varchar(26)
);
```

```shell
psql -h <host> -U <user> -d <database> -c "\copy run_minutes from '<path-to-csv-file>' delimiter ',' csv header"
```
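
As an example query, a sketch grounded in the `run_minutes` table above: total run minutes on private workers per stack, busiest stacks first:

```sql
-- Total run minutes on private workers, per stack, busiest first.
select
    stack_name,
    round(sum(state_duration_minutes)::numeric, 2) as total_minutes
from run_minutes
where public = false
group by stack_name
order by total_minutes desc
limit 20;
```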

### ELK stack

CSV files can be easily imported into [Elastic Stack](https://www.elastic.co/elastic-stack/) for visualization. The following example shows how to import your CSV data into [Kibana](https://www.elastic.co/kibana) using [Data Visualizer](https://www.elastic.co/blog/importing-csv-and-log-data-into-elasticsearch-with-file-data-visualizer).

Steps:

1. Download the CSV file.

    ```shell
    spacectl profile usage-csv -aspect run-minutes > usage.csv
    ```

    Note: the exported CSV data doesn't contain zero values, so percentile calculations will be incorrect. To fix this, you can add zero values to the CSV file before processing it; one way to do that is sketched after this list.

2. Open Kibana and go to Machine Learning.

    ![](<../assets/screenshots/usage_analytics/kibana-machine-learning.png>)

3. Select the CSV file.

    ![](<../assets/screenshots/usage_analytics/kibana-select-file.png>)

    ![](<../assets/screenshots/usage_analytics/kibana-file-uploaded.png>)

4. Use the mappings that were automatically generated by Kibana.

    ![](<../assets/screenshots/usage_analytics/kibana-mappings.png>)

5. Create a visualization using the imported data.

    Here's an example of a visualization that shows the number of run minutes on private workers, broken down by run state:

    ![](<../assets/screenshots/usage_analytics/kibana-visualization.png>)