Join Trend ‧ Join New Generation
Trend Micro - Global leader in cloud security / The largest software company in Asia / Operations spanning five continents / Trend Micro's global R&D base is in Taiwan
===============================================================
AI is driving a transformative shift across industries, and as a global cybersecurity leader, Trend Micro is at the forefront of this evolution, integrating AI to ensure the secure exchange of digital information. We look for individuals with strong technical expertise and a forward-thinking mindset who leverage AI to challenge the status quo and build impactful solutions. By adopting AI-powered development workflows, we empower our engineers to deliver reliable, high-quality, and scalable products with greater velocity.
The objective of TrendLife is to deliver the best security, privacy, and anti-scam solutions for the consumer market, using our decades of experience to protect millions of individual users worldwide. Everyone has the opportunity to join different teams and share their knowledge and passion, contributing to real results for our users.
We are looking for a technically strong and business-minded Data Engineer who can bridge the gap between raw data and business decisions. You'll own data pipelines, BI dashboards, and AI-powered workflows on Databricks, while acting as a trusted advisor to business stakeholders. The ideal candidate actively leverages AI tools (LLMs, Copilot, agents) to accelerate their own development velocity - not just build AI for others.
Responsibilities
Business Intelligence & Data Engineering
- Design, develop, and maintain high-quality ETL pipelines, scheduled jobs, and BI dashboards on Databricks (and cloud platforms such as AWS Redshift / BigQuery)
- Ensure data pipeline performance, reliability, and scalability to support business reporting and analytics needs
- Collaborate with product and engineering teams on data instrumentation and quality across all services
Business Consulting & Solution Design
- Deep-dive into business domain knowledge and regional context to understand underlying needs behind data requests
- Proactively propose data solutions and analytical approaches - not just fulfill requests, but challenge and improve them
- Define key metrics with business stakeholders and translate business questions into technical solutions
Data Evangelism & Stakeholder Enablement
- Guide business stakeholders on correct data interpretation and usage to support decision-making
- Identify potential data opportunities that business teams may not have articulated yet
AI-Augmented Development Productivity
- Actively adopt AI coding assistants and agent-based workflows to accelerate development - you should be faster and more consistent because of AI, not despite it
- Share and promote effective AI-assisted development patterns within the team
AI Development & Automation
- Develop and optimize AI Agents to automate customer profiling, operational analysis, and decision-making loops
- Architect AI workflows with a clear understanding of tool-calling, context engineering, and model selection tradeoffs (performance vs. cost)
- Leverage LLM orchestration tools (e.g., MCP, LangChain, Claude/Copilot CLI agents) to bridge enterprise data platforms with AI capabilities
Qualifications & Requirements
[Must Have]
- BS/MS in Computer Science, Information Engineering, or related field
- 3+ years of experience in cloud data engineering or backend development
- Moderate English reading/writing skills for cross-regional collaboration
- Hands-on experience developing ETL pipelines, jobs, and dashboards on Databricks
- Strong RDBMS knowledge with solid SQL (MSSQL/PostgreSQL) programming and query optimization skills
- Ability to understand business context and translate ambiguous requirements into clear technical solutions
- Proficiency in Python data processing (pandas, PySpark, polars, numpy)
- Strong communication skills - comfortable engaging with both technical teams and business stakeholders
- Passion for using new technology to improve personal and team productivity
[Nice to Have]
- Proficiency in architecting robust AI Agents with deep understanding of skill/tool management, context window engineering, and cost-aware model selection
- Experience with LLM orchestration frameworks (MCP, LangChain) or AI-augmented development tools (GitHub Copilot, Claude Code, Cursor)
- Familiarity with AWS data services (Redshift, Glue, S3) or GCP equivalents
- Experience with CI/CD pipelines (GitHub Actions or equivalent)
===============================================================