# Running

For now, just modify the script to your liking, then run it with multiple threads:

`JULIA_NUM_THREADS=8 julia paralleleureqa.jl`

## Modification

You can change the binary and unary operators in `eureqa.jl` here:
```
const binops = [plus, mult]
const unaops = [sin, cos, exp]
```
E.g., you can add another binary function with:
```
const binops = [plus, mult, (x, y)->x^2*y]
```
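
Unary operators can presumably be extended the same way, e.g., adding a squaring operator:
```
const unaops = [sin, cos, exp, x->x^2]
```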

You can change the dataset here:
```
const nvar = 5
const X = rand(100, nvar)
# Here is the function we want to learn (x2^2 + cos(x3))
const y = ((cx,)->cx^2).(X[:, 2]) + cos.(X[:, 3])
```
by either loading in a dataset or modifying the definition of `y`.
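
For example, a delimited text file could be loaded with the DelimitedFiles standard library (a sketch; `data.csv` is a hypothetical file, replacing the random `X` and `y` above):
```
using DelimitedFiles

# Hypothetical file: rows are samples, the first nvar columns are features,
# and the last column is the target.
const data = readdlm("data.csv", ',', Float64)
const X = data[:, 1:nvar]
const y = data[:, end]
```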

### Hyperparameters

Turn on annealing by setting the following in `paralleleureqa.jl`:

`const annealing = true`

Annealing lets each evolutionary cycle turn down the exploration
rate over time: by the end of a cycle (temperature 0), only solutions
better than the existing ones are accepted.

The following parameter, `parsimony`, controls how strongly complex solutions are punished:
```
const parsimony = 0.01
```
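
As a sketch of how such a penalty typically enters the objective (illustrative names, not the actual `eureqa.jl` internals):
```
# Sketch: score an equation by its error plus a size penalty.
penalized_score(mse, complexity; parsimony=0.01) = mse + parsimony * complexity

penalized_score(0.05, 12)  # 0.05 + 0.01 * 12 = 0.17
```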

Finally, the following determines how much to scale the temperature by (`T` ranges between 0 and 1):
```
const alpha = 10.0
```
Larger `alpha` means more exploration.
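
Together, these suggest a standard simulated-annealing acceptance rule along the following lines (a sketch assuming lower scores are better; not the literal `eureqa.jl` code):
```
# Sketch: accept or reject a mutated equation at temperature T.
function accept(old_score, new_score, T; alpha=10.0)
    new_score <= old_score && return true   # always keep improvements
    T <= 0 && return false                  # at temperature 0, reject anything worse
    # Worse solutions survive with a probability that grows with alpha and T,
    # so larger alpha means more exploration.
    return rand() < exp(-(new_score - old_score) / (alpha * T))
end
```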

One can also adjust the relative probabilities of each mutation here:
```
weights = [8, 1, 1, 1]
```
(in order: 1. perturb a constant, 2. mutate an operator,
3. append a node, 4. delete a subtree).
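
For intuition, sampling a mutation in proportion to these weights can be done as follows (a sketch; the actual implementation may differ):
```
# Sketch: pick a mutation index with probability proportional to its weight.
function sample_mutation(weights)
    r = rand() * sum(weights)
    for (i, w) in enumerate(weights)
        r -= w
        r <= 0 && return i
    end
    return length(weights)  # guard against floating-point round-off
end

sample_mutation([8, 1, 1, 1])  # returns 1 about 8/11 ≈ 73% of the time
```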


# TODO

- Record hall of fame
- Optionally migrate the hall of fame, rather than current bests