About RL Custom Agent / LQRCustomAgent example
hieu nguyen on 21 Apr 2023
I am trying to create my own custom RL agent, and I consulted the LQRCustomAgent example provided by MATLAB. Here is the link to the example:
In the learnImpl function there are two inputs, obj and exp. However, I can't see where exp is defined in this example. Why can they still use it? I mean, where does the exp input come from? Is there a MATLAB file or abstract class that defines it?
Here is another MATLAB example: https://www.mathworks.com/help/reinforcement-learning/ug/create-agent-for-custom-reinforcement-learning-algorithm.html
They still use the learnImpl function with the same two inputs, obj and exp. However, the way they index each element of exp is different from the previous example. What does each way of indexing mean?
I hope you can help answer my question. Thank you very much!
Accepted Answer
Emmanouil Tzorakoleftherakis on 24 Apr 2023
Actually, exp is being indexed in exactly the same way; the first example just does it in one line, while the second does it over two lines. When you index into the cell array, you get back the specific element, which is no longer a cell array; that's why you see the '{1}'.
Also, the exp cell array is created automatically in the background, so you don't need to worry about defining it yourself.
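To make the indexing concrete, here is a minimal sketch of a learnImpl body, assuming exp follows the {Observation, Action, Reward, NextObservation, IsDone} layout used in both examples, where Observation and NextObservation are themselves cell arrays with one cell per observation channel (the names x, u, r, and dx are just placeholders):

function obj = learnImpl(obj,exp)
    % Style of the LQR example: index exp in a single line.
    % exp{1} is the observation cell array; the second {1} extracts
    % the numeric data of the first observation channel.
    x  = exp{1}{1};    % current observation
    u  = exp{2}{1};    % action applied
    r  = exp{3};       % scalar reward
    dx = exp{4}{1};    % next observation

    % Equivalent style of the second example: index over two lines.
    obs = exp{1};      % obs is still a cell array here ...
    x2  = obs{1};      % ... so one more {1} gives the same data as x

    % From here on, x and x2 hold identical numeric arrays, and the
    % rest of the learning update would use them as usual.
end

The training loop builds this cell array from each environment step and passes it to learnImpl for you, which is why you never see exp defined anywhere in the agent class.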
Hope this helps