Client:
Tether Operations Limited
Location:
Dublin, Ireland
Job Category:
Other
EU work permit required:
Yes
Posted:
24.06.2025
Expiry Date:
08.08.2025
Job Description:
Join Tether and Shape the Future of Digital Finance
At Tether, we’re pioneering a global financial revolution with products that enable seamless, secure, and instant digital transactions across blockchains. Our solutions support businesses like exchanges, wallets, and ATMs in using reserve-backed tokens, promoting transparency and trust in digital finance.
Our Offerings:
* Tether Finance: Home to the trusted stablecoin USDT and digital asset tokenization services.
* Tether Power: Eco-friendly energy solutions for Bitcoin mining, utilizing state-of-the-art facilities.
* Tether Data: Cutting-edge AI and data sharing applications like KEET.
* Tether Education: Digital learning initiatives for individuals in the digital and gig economies.
* Tether Evolution: Innovating at the intersection of technology and human potential.
Why Join Us?
Our global, remote team is passionate about fintech innovation. If you excel in English communication and want to contribute to a leading platform, Tether is your place to grow and make an impact.
About the Role:
You will be part of our AI model team, focusing on optimizing model deployment and inference for advanced AI systems. Your work involves developing scalable, efficient, and responsive inference architectures suitable for diverse environments, including resource-limited devices.
Responsibilities:
* Design and deploy high-performance model serving architectures optimized for low latency and memory efficiency.
* Set and track performance metrics such as response time, throughput, and memory usage.
* Conduct inference testing in simulated and real environments, analyzing results to improve system performance.
* Develop test datasets and scenarios for real-world deployment, especially on low-resource devices.
* Identify bottlenecks and optimize serving pipelines for scalability and reliability.
* Collaborate with cross-functional teams to integrate solutions into production environments, ensuring continuous improvement.
Qualifications:
* A degree in Computer Science or a related field; a PhD is preferred.
* Proven experience in inference optimization on mobile devices, kernel development, and model serving frameworks is essential.
* A strong research background, with publications in reputable conferences, is advantageous.
* Expertise in CPU/GPU kernel programming, inference pipeline development, and deploying models on resource-constrained devices is required.
Please note: If you do not hold a passport for the country of the vacancy, a work permit may be required. For more information, visit our Blog.
Ensure applications are submitted via the 'Apply now' button. Do not include bank or payment details in your application. Eurojobs.com is not responsible for external content.