
    On Community


    NOTE: This is an excerpt from my September 20th podcast newsletter. To sign up for the newsletter, visit: This Week in Machine Learning & AI Newsletter.

    After a calm and productive few weeks at home, conference season is back in full swing for the fall. This week, I’m at the O’Reilly/Intel Nervana Artificial Intelligence Conference in San Francisco. As much as travel can be a grind, I really get a kick out of engaging with the TWiML community in person—both interviewing guests and meeting up with listeners.

    Two of the listeners I met up with this time were Xinyu Hong and Richard Shen, the winners of our recent AI Conference ticket giveaway. Xinyu is a student at Yale, and Richard just finished up at Dartmouth and is about to start a graduate program at Cambridge. While we didn’t specifically target students with this contest, it’s great to know the pod was able to give back in this way, and hearing their enthusiasm for machine learning and the conference was amazing!

    Now, as much as I love in-person meet-ups, our virtual meet-ups have been great too! For this month’s meetup last Tuesday, meetup member Nikola Kučerová presented Yoshua Bengio’s 1994 paper (co-authored with Patrice Simard and Paolo Frasconi), Learning Long-Term Dependencies with Gradient Descent is Difficult. This is an important paper that explores one of the key challenges in training recurrent neural nets: the vanishing gradient problem, in which the error signal shrinks exponentially as it’s propagated back through time. It examines a number of alternative approaches, and ultimately sets the stage for later advances like LSTM and GRU networks. Thanks for presenting, Nikola!
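
    If you want a quick feel for why those gradients vanish, here’s a simplified sketch of my own (it’s not from the paper or from Nikola’s talk, and the hidden size and 0.9 spectral norm below are arbitrary choices for illustration). Backprop through time multiplies the gradient by the recurrent weight matrix at every step, so when that matrix consistently shrinks vectors, the gradient decays exponentially with sequence length:

        import numpy as np

        rng = np.random.default_rng(0)
        hidden_size, steps = 32, 50

        # Recurrent weight matrix, rescaled so its largest singular value is 0.9 (< 1).
        W = rng.standard_normal((hidden_size, hidden_size))
        W *= 0.9 / np.linalg.norm(W, 2)

        grad = np.ones(hidden_size)  # error signal arriving at the final time step
        for t in range(steps):
            grad = W.T @ grad  # one step of backprop through a linear recurrence
            if (t + 1) % 10 == 0:
                print(f"step {t + 1:3d}: gradient norm = {np.linalg.norm(grad):.2e}")

    This leaves out the activation function’s derivative, which in the usual tanh case only shrinks the gradient further. Flip the 0.9 to something above 1 and the same loop demonstrates the mirror-image exploding gradient problem.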

    The replay is posted. Check it out, and join the meetup, at twimlai.com/meetup.

    That’s it for now. Be sure to follow me on Twitter (@samcharrington & @twimlai) and Instagram (@twimlai) to keep up with me on my travels.
