Back to search: Big Data / Anhui

Job Description
Responsible for developing and delivering big data integration requirements across the global enterprise

Design, develop, optimize, and operate hybrid ETL architectures, platforms, and tools

Ensure high stability, performance, scalability, and quality of the enterprise's global data collection architecture

Requirements

  • More than 3 years of development experience with NiFi, Python, Java/.NET, Go, etc.; DevOps/CI/CD/SRE experience is a plus
  • Able to independently own the design, development, tuning, and operation of ETL/ELT systems and big data synchronization/replication
  • Familiar with SQL development on relational databases (Oracle, MySQL/PostgreSQL, SQL Server, etc.), with performance tuning experience
  • Responsible for platform tool management; familiar with Linux commands; experience deploying and managing server and tool software on VMs, Kubernetes (K8s), and Docker
  • Hands-on experience with Hadoop architecture is a plus, as is hands-on experience with domestic or international cloud big data services (Alibaba Cloud, Tencent Cloud, GCP, AWS, Azure)
  • Full-time degree in computer science, big data, information management, or a related major

  • More than 5 years of work experience with a bachelor's degree, or more than 3 years with a master's degree

  • Good communication, presentation, and writing skills; good English skills
  • Proactive and positive attitude; careful and meticulous; experience working with global teams is a plus