---
license: openrail
task_categories:
- time-series-forecasting
size_categories:
- 1K<n<10K
---
**Download the Dataset**:
```python
from datasets import load_dataset

dataset = load_dataset("LeoTungAnh/electricity_hourly")
```
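
The loader returns a `DatasetDict`. A quick way to check which splits and fields are available (the split name "train" below is an assumption, since the split names are not listed in this card):

```python
from datasets import load_dataset

dataset = load_dataset("LeoTungAnh/electricity_hourly")

# Show the available splits and their sizes.
print(dataset)

# Assuming a "train" split exists, inspect the fields of one record.
sample = dataset["train"][0]
print(sample.keys())
print(len(sample["target"]))  # number of hourly observations in this series
```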

**Dataset Card for Electricity Consumption**

This dataset contains hourly electricity consumption in kilowatts (kW) for 370 individual clients in Portugal, covering the three years 2012-2014.

**Preprocessing information**:
- Grouped by hour (frequency: "1H").
- Applied standardization ("Std") as the preprocessing technique.

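For reference, the standardization step is the usual z-score transform; the sketch below illustrates that technique and is not the exact preprocessing code used to build this dataset:

```python
import numpy as np

def standardize(series):
    """Z-score a series: subtract the mean, divide by the standard deviation."""
    values = np.asarray(series, dtype=np.float64)
    std = values.std()
    return (values - values.mean()) / std if std > 0 else values - values.mean()

# Example with raw hourly consumption values in kW.
raw_kw = [12.0, 15.5, 11.8, 14.2]
print(standardize(raw_kw))
```
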
**Dataset information**:
- Number of time series: 370
- Number of training samples: 26208
- Number of validation samples: 26256 (number_of_training_samples + 48)
- Number of testing samples: 26304 (number_of_validation_samples + 48)

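The validation and testing counts each add 48 steps, which suggests a 48-hour forecast horizon. Below is a minimal sketch of how the three windows could relate, assuming the counts are per-series lengths, the splits are named "train"/"validation"/"test", and the test split holds the full-length series:

```python
from datasets import load_dataset

dataset = load_dataset("LeoTungAnh/electricity_hourly")

HORIZON = 48  # forecast horizon implied by the +48 offsets above

# Assumption: the test split carries the full 26304-step series, and the
# validation/training windows are prefixes of it.
full_series = dataset["test"][0]["target"]                       # 26304 steps
validation_window = full_series[:len(full_series) - HORIZON]     # 26256 steps
training_window = full_series[:len(full_series) - 2 * HORIZON]   # 26208 steps

print(len(full_series), len(validation_window), len(training_window))
```
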
**Dataset format**:
```python
Dataset({
    features: ['start', 'target', 'feat_static_cat', 'feat_dynamic_real', 'item_id'],
    num_rows: 370
})
```

**Data format for a sample**:

- 'start': datetime.datetime (start timestamp of the series)
- 'target': list of the time series values
- 'feat_static_cat': time series index
- 'feat_dynamic_real': None
- 'item_id': name of the time series

**Data example**:
```python
{'start': datetime.datetime(2012, 1, 1, 1, 0),
 'target': [-0.19363673541224083, -0.08851588245610625, -0.19363673541224083, ... -0.5615597207587115, ...],
 'feat_static_cat': [0],
 'feat_dynamic_real': None,
 'item_id': 'MT_001'}
```
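
Because each record carries a 'start' timestamp and the frequency is hourly, a record can be turned into a timestamp-indexed series. A minimal sketch using pandas (the "train" split name is assumed):

```python
import pandas as pd
from datasets import load_dataset

dataset = load_dataset("LeoTungAnh/electricity_hourly")
record = dataset["train"][0]  # assuming a "train" split

# Build an hourly index starting at the record's 'start' timestamp.
index = pd.date_range(start=record["start"], periods=len(record["target"]), freq="H")
series = pd.Series(record["target"], index=index, name=record["item_id"])
print(series.head())
```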

**Usage**:
- The dataset can be used with the Transformer, Autoformer, and Informer time-series models available in Hugging Face.
- Other algorithms can extract the data directly by making use of the 'target' feature.
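
For the second bullet, extracting the series amounts to stacking the 'target' arrays; a minimal sketch (again assuming a "train" split):

```python
import numpy as np
from datasets import load_dataset

dataset = load_dataset("LeoTungAnh/electricity_hourly")

# Stack the 370 standardized series into a (num_series, num_steps) matrix
# that any forecasting library can consume.
targets = np.stack([np.asarray(rec["target"]) for rec in dataset["train"]])
print(targets.shape)  # expected: (370, 26208)
```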