[mlpack] Reinforcement Learning
ee15b108 at smail.iitm.ac.in
Mon Mar 6 12:47:18 EST 2017
I am sorry that I misunderstood the description; thanks a lot for
clarifying it. I also referred to the thread from February:
The cross-validation and hyper-parameter tuning project is pretty new,
and there is not much in the way of existing bugs that will help
understand it since the project involves generating a completely new
piece of code for mlpack.
I just opened some issues for the decision tree code today; maybe you
can find one of those interesting?
(the top 5 are related to decision trees, at least as of this writing).
I think one approach would be to use the various different classifiers
and functionality inside of mlpack, and then write some simple C++
programs to do cross-validation or hyper-parameter tuning by hand.
Then, this could help make it more clear what the needs of the
hyper-parameter tuning module and cross-validation module would be.
Please correct me if I am wrong: I have read the API documentation for
Linear Regression and KNN so far, so I can try using other mlpack
methods, read their API docs, and try to understand how the code works
from there.
As you mentioned in that thread, I will take my time to understand the
different mlpack methods and then write C++ code to perform
cross-validation, gaining more insight into what would go into
designing a generic module that can handle various types of models,
choose loss functions for them, etc.
I am familiar with template metaprogramming from my Object-Oriented
Programming Lab course in college. I saw a few links on the topic on
the mlpack GSoC page and will read through those too.
Also, I'll take your advice and not try to read too much in one go.
Indian Institute of Technology - Madras.
On Mon, Mar 6, 2017 at 7:58 AM, Ryan Curtin <ryan at ratml.org> wrote:
> On Sun, Mar 05, 2017 at 12:23:34PM +0530, S.NARAYAN ee15b108 wrote:
> > I pulled an all nighter
> I know this point is a little off topic here, but in my opinion this is
> not a great way to ingest the material! Get some sleep and take it
> slow; the projects for mlpack tend to be quite complex and the necessary
> material needs to be understood in-depth, which (at least for me)
> couldn't be done in a single sleepless night.
> > As it's a new one, no relevant issues exist as of now, so I would like to
> > start by adding a cross-validation module to Linear Regression (I will be
> > working on it now), and once I get accustomed to mlpack's coding style and
> > API designs I can add further modules for other training algorithms and
> > also optimize my approaches (through better pre-computation, etc.).
> This isn't really what the project is about, unfortunately. Please
> carefully read the description. We want to create a generic
> cross-validation module that can work with any mlpack classifier. So
> the place to start is not to graft support onto the mlpack linear
> regression implementation but instead carefully consider the set of
> classifiers that mlpack implements and determine a way to create a
> generic cross-validation module that can support them all.
> The cross-validation module will probably need to use some amount of
> template metaprogramming so you probably want to spend some time
> familiarizing yourself with that also.
> I believe that I have written an email to the mailing list about this
> project (I haven't gone through the effort to find it right now); I'd
> suggest searching the mailing archives to find it. This could help
> clarify what the project is all about.
> Ryan Curtin | "Lady, I'm gonna have to ask you to leave the store."
> ryan at ratml.org | - Ash in Housewares