
Commit 326c6a9

docs(usage-view): how to use csv data export
Signed-off-by: Michal Wasilewski <michal@mwasilewski.net>
1 parent c0e72c6

7 files changed: +91 -1 lines changed

docs/product/billing/usage.md

+89
@@ -0,0 +1,89 @@
---
description: >-
  Describes how users can analyze their usage and billing data.
---

# Usage

## Download Usage Data

Usage data can be downloaded:

- from the Usage page: Organization Settings -> Other -> Usage
- using the [spacectl](https://github.com/spacelift-io/spacectl) command: `spacectl profile usage-csv`
- via the Spacelift API at `https://<your-spacelift-host>/usageanalytics/csv` (note: this is not a GraphQL endpoint); see the sketch below
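
For the API route, a minimal `curl` sketch follows. The bearer-token header is an assumption for illustration, not a documented contract; check the Spacelift API documentation for the exact authentication scheme.

```shell
# Hypothetical sketch: the Authorization header format is an assumption.
# Replace <your-spacelift-host> with your Spacelift hostname.
curl -H "Authorization: Bearer $SPACELIFT_API_TOKEN" \
  "https://<your-spacelift-host>/usageanalytics/csv" \
  -o usage.csv
```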

## Analyzing Usage Data

### Load CSV data into PostgreSQL

Create a table for the worker count data and load the CSV file into it:

```sql
create table worker_count (
    id int,
    count int,
    timestamp_unix int,
    worker_pool_name varchar(255)
);
```

```shell
psql -h <host> -U <user> -d <database> -c "\copy worker_count from '<path-to-csv-file>' delimiter ',' csv header"
```
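
As a quick sanity check, you can query the table directly. A hypothetical example, assuming the schema above, that shows the average worker count per pool and the time range covered by the export:

```sql
-- Average worker count per pool, plus the sampled time range.
select worker_pool_name,
       avg(count) as avg_workers,
       to_timestamp(min(timestamp_unix)) as first_sample,
       to_timestamp(max(timestamp_unix)) as last_sample
from worker_count
group by worker_pool_name;
```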

Create a table for the run minutes data and load it the same way:

```sql
create table run_minutes (
    timestamp_unix bigint,
    state_duration_minutes float,
    public boolean,
    run_state varchar(255),
    run_type varchar(255),
    run_ulid varchar(26),
    stack_name varchar(255),
    stack_slug varchar(255),
    stack_ulid varchar(26),
    is_stack boolean,
    worker_pool_name varchar(255),
    worker_pool_ulid varchar(26)
);
```

```shell
psql -h <host> -U <user> -d <database> -c "\copy run_minutes from '<path-to-csv-file>' delimiter ',' csv header"
```
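
With both tables loaded, you can start slicing the data in SQL. A hypothetical query, assuming the schema above, that totals run minutes on private workers per stack:

```sql
-- Total run minutes on private workers, per stack, busiest first.
select stack_name,
       sum(state_duration_minutes) as total_minutes
from run_minutes
where public = false
group by stack_name
order by total_minutes desc
limit 10;
```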

### ELK stack

CSV files can easily be imported into the [Elastic Stack](https://www.elastic.co/elastic-stack/) for visualization. The following example shows how to import your CSV data into [Kibana](https://www.elastic.co/kibana) using the [Data Visualizer](https://www.elastic.co/blog/importing-csv-and-log-data-into-elasticsearch-with-file-data-visualizer).

Steps:

1. Download the CSV file.

    ```shell
    spacectl profile usage-csv -aspect run-minutes > usage.csv
    ```

    Note: the exported CSV data doesn't contain zero values, so percentile calculations will be incorrect. To fix this, you can add zero-value rows to the CSV file before processing it.

2. Open Kibana and go to Machine Learning.

    ![](../../assets/screenshots/usage-view-csv-kibana-ml.png)

3. Select the CSV file.

    ![](../../assets/screenshots/usage-view-csv-kibana-visualize.png)

    ![](../../assets/screenshots/usage-view-csv-kibana-upload.png)

4. Use the mappings that were automatically generated by Kibana.

    ![](../../assets/screenshots/usage-view-csv-kibana-mappings.png)

5. Create a visualization using the imported data.

    Here's an example of a visualization that shows the number of run minutes on private workers, broken down by run state:

    ![](../../assets/screenshots/usage-view-csv-kibana-analysis.png)

nav.yaml

+2 -1

@@ -134,7 +134,7 @@ nav:
     - integrations/cloud-providers/oidc/azure-oidc.md
     - integrations/cloud-providers/oidc/vault-oidc.md
     - Observability:
-      - integrations/observability/README.md
+      - integrations/observability/README.md
     - integrations/observability/datadog.md
     - integrations/observability/prometheus.md
     - Source Control:
@@ -166,6 +166,7 @@ nav:
     - Billing:
        - product/billing/stripe.md
        - product/billing/aws-marketplace.md
+       - product/billing/usage.md
     - ⚖️ Legal:
        - legal/terms.md
        - legal/refund-policy.md
