The main problem with the new design is that it makes the decision-tree AI largely unnecessary. It is still too close to the original design to make proper use of the AI that is supposed to be the main technological effort of this project. The original design was meant to be a demo for a particular company I wanted to apply to, but now that I am no longer interested in applying, meeting the technological requirements at the cost of the design may be best. However, elemental alignment is still based on environment, although this need not be in the initial demo. Creatures should have the following desires:
get food (low food/health bar)
violence (low violence bar)
explore (low explore bar)
The rate at which each bar empties is determined by the creature's nature.
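The desire system above could be sketched roughly as follows. This is only an illustrative assumption of how it might work; all names (Creature, drain_rates, most_urgent_desire) are hypothetical, not an existing API.

```python
class Creature:
    def __init__(self, drain_rates):
        # drain_rates: per-tick decay for each bar, fixed by the creature's nature
        self.bars = {"food": 1.0, "violence": 1.0, "explore": 1.0}
        self.drain_rates = drain_rates

    def tick(self, dt=1.0):
        # Each bar empties at its own rate; a low bar signals the desire.
        for name, rate in self.drain_rates.items():
            self.bars[name] = max(0.0, self.bars[name] - rate * dt)

    def most_urgent_desire(self, threshold=0.3):
        # The emptiest bar below the threshold becomes the active desire.
        name, level = min(self.bars.items(), key=lambda kv: kv[1])
        return name if level < threshold else None
```

A creature whose nature makes it hungry quickly might get `{"food": 0.05, "violence": 0.01, "explore": 0.02}`, so the food desire fires long before the others.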
training data = the need to be met (allow an ‘or none’ case somehow)
stimulus = user coaxing; less-rewarding actions are less likely to be tried
how do we get it to try things when presented with stimulus?
you click one of two commands to it, and it tries something nearby
train to go to or away from the cursor
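Going to or away from the cursor could be a single movement primitive, with the direction chosen by whichever association has been trained. A minimal sketch, assuming a 2D position and a per-step speed (all names hypothetical):

```python
import math

def step_relative_to_cursor(pos, cursor, toward, speed=1.0):
    # Move one step toward the cursor if toward=True, away otherwise.
    dx, dy = cursor[0] - pos[0], cursor[1] - pos[1]
    dist = math.hypot(dx, dy)
    if dist == 0:
        return pos  # already at the cursor; nowhere to go
    sign = 1.0 if toward else -1.0
    return (pos[0] + sign * speed * dx / dist,
            pos[1] + sign * speed * dy / dist)
```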
..click/button commands -> it tries things; the user action is recorded as ‘stimulus’ in association with the goodness reinforcement (if any)
..a desire needs filling -> it tries things, picking the right action; the need is recorded as ‘desire’ in association with the goodness reinforcement (if any)
..whenever updating the data, check whether the entry already exists
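The recording-and-update step above might look something like this sketch. It assumes (hypothetically) that each record keys on (trigger kind, trigger, action) and stores a running average reward, updated in place when the entry exists and created otherwise; less-rewarding actions then get lower selection weight.

```python
import random

training_data = {}  # (trigger_kind, trigger, action) -> (avg_reward, count)

def record_outcome(trigger_kind, trigger, action, reward):
    # trigger_kind is "stimulus" (user action) or "desire" (need filling)
    key = (trigger_kind, trigger, action)
    if key in training_data:                  # entry exists: update running average
        avg, n = training_data[key]
        training_data[key] = ((avg * n + reward) / (n + 1), n + 1)
    else:                                     # entry missing: create it
        training_data[key] = (reward, 1)

def pick_action(trigger_kind, trigger, actions):
    # Less-rewarding actions are less likely, with a floor weight so the
    # creature still occasionally tries everything; unknown actions get a
    # neutral default so it is willing to experiment.
    weights = [max(0.1, training_data.get((trigger_kind, trigger, a), (0.5, 0))[0])
               for a in actions]
    return random.choices(actions, weights=weights)[0]
```

The same two functions would cover both trigger kinds, which keeps the "check if entry exists" logic in one place.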
have a ‘find closest’ decision?
OR just have the training include giving treats.