A few months ago I came across maximum mean discrepancy as a measure of distribution difference, and today I read the term, totally forgot what it means, and had to find a YouTube video to refresh my understanding. This happens a lot in my research. I feel like unless something is really basic (e.g. CNN, cross entropy, etc.) and used a lot in my day-to-day model building, I easily forget what I have read. I wonder if it's just because I have a bad memory, or because I don't have a good way to organize information?
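
For anyone else who needs the refresher: MMD compares two samples by the difference of their mean kernel embeddings. Here is a minimal sketch of the (biased) estimator with an RBF kernel; the bandwidth and the toy data are just illustrative, not anything from a specific paper or library.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    # Pairwise squared distances, then Gaussian (RBF) kernel values.
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-sq / (2 * sigma**2))

def mmd2(X, Y, sigma=1.0):
    # Biased estimator of squared MMD:
    # mean k(x, x') + mean k(y, y') - 2 * mean k(x, y)
    Kxx = rbf_kernel(X, X, sigma)
    Kyy = rbf_kernel(Y, Y, sigma)
    Kxy = rbf_kernel(X, Y, sigma)
    return Kxx.mean() + Kyy.mean() - 2 * Kxy.mean()

# Two samples from the same distribution give a small MMD,
# samples from shifted distributions give a clearly larger one.
rng = np.random.default_rng(0)
X1 = rng.normal(0.0, 1.0, size=(500, 2))
X2 = rng.normal(0.0, 1.0, size=(500, 2))
Y  = rng.normal(1.0, 1.0, size=(500, 2))
print(mmd2(X1, X2), mmd2(X1, Y))  # ~0 vs noticeably positive
```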

  • nikgeo25@alien.top · 1 year ago

    There are lots of engineering tricks given fancy names in ML. No point in memorizing all of them, but keep reading them to maintain intuition.