Nithin Sai Ram Vasamsetti

India

@nithin_270103

Badges

Python


Work Experience

  • Software Engineer

    Other • May 2023 - July 2023

    Applied knowledge distillation to train a smaller student CNN to match or surpass the accuracy of a larger teacher CNN. Knowledge is transferred from the teacher model to the student model, enabling the student to achieve comparable performance while being more computationally efficient.
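    The distillation described above is commonly implemented with a soft-target loss in the style of Hinton et al.; the sketch below is a minimal, hypothetical illustration of that loss in NumPy (the temperature `T`, weight `alpha`, and function names are assumptions, not details from this project).

    ```python
    import numpy as np

    def softmax(logits, T=1.0):
        # Temperature-scaled softmax: higher T yields softer distributions
        z = logits / T
        z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
        e = np.exp(z)
        return e / e.sum(axis=-1, keepdims=True)

    def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
        """Weighted sum of (1) KL divergence between softened teacher and
        student predictions and (2) cross-entropy against the hard labels."""
        p_teacher = softmax(teacher_logits, T)
        p_student = softmax(student_logits, T)
        # KL(teacher || student); scaled by T^2 to keep gradient magnitudes comparable
        kl = np.sum(p_teacher * (np.log(p_teacher + 1e-12)
                                 - np.log(p_student + 1e-12)), axis=-1)
        soft_loss = (T ** 2) * kl.mean()
        # Standard cross-entropy on the hard labels at T = 1
        p_hard = softmax(student_logits, 1.0)
        hard_loss = -np.log(p_hard[np.arange(len(labels)), labels] + 1e-12).mean()
        return alpha * soft_loss + (1 - alpha) * hard_loss
    ```

    During training, the student minimizes this combined loss over batches of teacher and student logits; when the student's logits equal the teacher's, the soft-target term vanishes.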

Education

  • NIT Andhra Pradesh

    Mechanical Engineering, B.Tech • August 2020 - April 2024

    I am available to work from January 2024.
