The design of Brain-Machine Interface (BMI) neural decoders that perform robustly in the changing environments encountered in daily-life activities is a challenging problem. One solution is to design neural decoders that assist and adapt to the user by participating in the user's perception-action-reward cycle (PARC). Drawing inspiration from reinforcement learning theories in both artificial intelligence and neurobiology, we have designed a novel decoding architecture that enables a symbiotic relationship between the user and an Intelligent Assistant (IA). By tapping into the motor and reward centers of the brain, the IA adapts the decoding of neural motor commands into prosthetic actions according to the user's goals. This paper focuses on extracting goal information directly from the brain and making it accessible to the IA as evaluative feedback for adaptation. We recorded the neural activity of the Nucleus Accumbens in behaving rats during a reaching task. The peri-event time histograms reveal a rich representation of reward prediction in this subcortical structure, one that can be modeled on a single-trial basis as a scalar evaluative feedback with high precision.
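As a rough illustration of the analysis described above, the sketch below computes a peri-event time histogram from spike times aligned to an event, and derives a single-trial scalar evaluative feedback from the post-event firing rate. The function names, the post-event window, and the rate threshold are illustrative assumptions, not the authors' actual model.

```python
import numpy as np

def peri_event_histogram(spike_times, event_times, window=(-1.0, 1.0), bin_size=0.05):
    """Trial-averaged firing rate (Hz) in bins aligned to each event time.

    Illustrative sketch: bins spike times relative to every event and
    averages across trials, as in a standard peri-event time histogram.
    """
    n_bins = int(round((window[1] - window[0]) / bin_size))
    edges = np.linspace(window[0], window[1], n_bins + 1)
    counts = np.zeros(n_bins)
    for t0 in event_times:
        counts += np.histogram(spike_times - t0, bins=edges)[0]
    return counts / (len(event_times) * bin_size)  # firing rate in Hz

def evaluative_feedback(spike_times, event_time, window=(0.0, 0.5), threshold_hz=10.0):
    """Hypothetical single-trial scalar feedback: +1 if the post-event
    firing rate exceeds a threshold, else -1.  Window and threshold are
    illustrative, not taken from the paper."""
    mask = (spike_times >= event_time + window[0]) & (spike_times < event_time + window[1])
    rate = mask.sum() / (window[1] - window[0])
    return 1.0 if rate > threshold_hz else -1.0
```

A decoder's adaptation loop could then consume this scalar on each trial as a reward signal, in place of an externally supplied error measure.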