mindBody.md
Amira Abdel-Rahman authored
# Shape and Control Optimization
Research and development of workflows for the co-design of reconfigurable AI software and hardware.

## Weight Agnostic Neural Networks (WANN)
### Introduction
- Paper:
  - "focus on finding minimal architectures".
  - "By deemphasizing learning of weight parameters, we encourage the agent instead to develop ever-growing networks that can encode acquired skills based on its interactions with the environment".
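The quotes above describe the core WANN idea: a topology is scored by how well it performs when every connection shares a single weight value, averaged over several such values. A minimal sketch of that evaluation loop (the two-layer toy network and function names here are illustrative assumptions, not the paper's code):

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def eval_network(x, weight):
    # Toy feed-forward net: every connection uses the SAME shared weight,
    # so performance reflects the topology rather than tuned parameters.
    h = relu(weight * x)          # input -> hidden
    return np.tanh(weight * h)    # hidden -> output

def wann_fitness(x, target, shared_weights=(-2.0, -1.0, -0.5, 0.5, 1.0, 2.0)):
    # A WANN is scored by its mean performance over a range of shared
    # weight values; architectures that only work for one finely tuned
    # weight are penalized.
    errors = [np.mean((eval_network(x, w) - target) ** 2) for w in shared_weights]
    return -np.mean(errors)  # higher is better
```

The evolutionary search then keeps the minimal topologies with the best shared-weight fitness.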


### 1 - FRep Search
As a first step toward understanding the code and WANN training, I implemented a toy problem that learns the functional representation (FRep) of a target image: given the (x, y) position of every pixel in the image, find the distance function that represents how far that pixel is from the edge of the shape.
In the following training run the target shape is a circle and the inputs are the x and y positions of the pixels; the search found a minimal neural network architecture (drawn from a library of given nonlinear functions) that maps the input position to the target shape.
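For the circle target, the ground-truth distance function the network has to approximate can be written in closed form. A minimal sketch of how the training data might be built (grid size and function names are my assumptions for illustration):

```python
import numpy as np

def circle_distance(x, y, radius=0.5):
    # Signed distance from pixel (x, y) to the edge of a circle centered
    # at the origin: negative inside the shape, zero on the edge,
    # positive outside.
    return np.sqrt(x ** 2 + y ** 2) - radius

# Build the training set: one (x, y) input per pixel, distance as target.
n = 64
xs, ys = np.meshgrid(np.linspace(-1, 1, n), np.linspace(-1, 1, n))
targets = circle_distance(xs, ys)
```

The network's two inputs are a pixel's `xs`, `ys` values and its single output is compared against `targets`.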
Graph Evolution:

Target Evolution:

### 2 - Walker

- Variables
  - Number of Variables
    - Each leg has 2 independent degrees of freedom
  - Modularity and Hierarchy
    - Similar to the rover, I can explore the benefit of hierarchy and modularity by taking advantage of the symmetry and hierarchy in the problem
- Objective function
  - This problem will be a bit harder, as the walker has to learn to stand first and then walk
  - I can try to make it easier by restraining some degrees of freedom
  - I want to explore whether the graphs of the two behaviors are related or not

### Progress and Results
- Integration:
  - Done integrating WANN with MetaVoxels; libraries/functions only need to be loaded once and can then be called each time a simulation is run in parallel
- Training
  - Objective function:
    - reward the maximum x position reached by any voxel
  - Training Results:



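The max-x objective above can be sketched as a small fitness function over the simulated voxel positions (this is an illustrative stand-in, not the MetaVoxels API; the array layout is an assumption):

```python
import numpy as np

def walker_fitness(voxel_positions):
    # voxel_positions: (n_voxels, 3) array of final (x, y, z) voxel
    # positions after the simulation. The reward is the x coordinate of
    # the voxel that traveled farthest, i.e. how far the walker moved
    # forward along the x axis.
    return float(np.max(voxel_positions[:, 0]))
```

Each candidate network is simulated and the resulting positions are scored with this function; a standing-then-walking curriculum would only change how this score is accumulated over the episode.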