If you really want to do it you can! Just keep on trying your best and practice whenever you have time.

Diligence is not a race against time, but continuous, dripping water wears through the rock.

Set your mind to it and you can do it!

1. VISA

Sr. Data Engineer

  • Java and Big Data technologies like Hive, Hadoop, and Spark
  • Understanding and working experience with shell scripting
  • Knowledge and working experience on Git/Stash, Ant, Maven, Jenkins and Jira
  • Experience with database technologies like DB2, Oracle, SQL Server
  • Knowledge of Unix/Linux
  • Strong foundation in computer science, with strong competencies in data structures, algorithms and software design optimized for building highly distributed and parallelized systems
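
As a rough illustration of the Hive/Spark part of this stack, here is a minimal PySpark sketch of a batch rollup job; the database, table, and column names (transactions_db.daily_txn, merchant_id, amount, txn_date) are hypothetical placeholders, not taken from the posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Spark session with Hive support so managed Hive tables can be read and written.
spark = (
    SparkSession.builder
    .appName("daily-merchant-rollup")
    .enableHiveSupport()
    .getOrCreate()
)

# Aggregate one day's transactions per merchant and persist the result.
txn = spark.table("transactions_db.daily_txn").where(F.col("txn_date") == "2024-01-01")
rollup = txn.groupBy("merchant_id").agg(
    F.count("*").alias("txn_count"),
    F.sum("amount").alias("total_amount"),
)
rollup.write.mode("overwrite").saveAsTable("transactions_db.daily_merchant_rollup")
```

In practice a job like this is built with Maven or Ant, submitted via spark-submit from a shell script, and triggered by Jenkins, which is where the build-tooling bullets meet the data work.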

2. JPMorgan

Senior Lead Data Engineer

Required Qualifications, Capabilities, And Skills

  • Working experience with both relational and NoSQL databases
  • Advanced understanding of database back-up, recovery, and archiving strategies
  • Advanced knowledge of linear algebra, statistics, and geometrical algorithms
  • Experience presenting and delivering visual data
  • Deep understanding of distributed systems and cloud technologies (AWS, GCP, Azure, etc.)
  • Experience in all stages of the software development lifecycle (requirements, design, architecture, development, testing, deployment, release, and support)
  • Experience with large-scale datasets and data lake / data warehouse technologies at TB scale (ideally PB scale), with at least one of BigQuery, Redshift, or Snowflake
  • Experience in leading a small team of technologists to manage and resolve technical items within expertise

Preferred Qualifications, Capabilities, And Skills

  • Experience with a scheduling system (Airflow, Azkaban, etc.)
  • Understanding of (distributed and non-distributed) data structures, caching concepts, CAP theorem
  • Experience in automating deployment, releases and testing in continuous integration, continuous delivery pipelines
  • Experience with containers and container-based deployment environment (Docker, Kubernetes, etc.)
  • A solid approach to writing unit level tests using mocking frameworks, as well as automating component, integration and end-to-end tests
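
For the last bullet (unit tests with mocking frameworks), here is a small pytest-style sketch using unittest.mock; load_daily_snapshot and its warehouse client are hypothetical examples, not part of the posting.

```python
from unittest.mock import MagicMock

def load_daily_snapshot(client, table: str) -> int:
    """Copy a table into its snapshot and return the source row count."""
    rows = client.query(f"SELECT COUNT(*) FROM {table}")
    client.copy_table(table, f"{table}_snapshot")
    return rows

def test_load_daily_snapshot_counts_and_copies():
    client = MagicMock()
    client.query.return_value = 42  # stub the warehouse response

    assert load_daily_snapshot(client, "orders") == 42
    client.copy_table.assert_called_once_with("orders", "orders_snapshot")
```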

3. GovTech

Data Engineer (Quantitative Strategy)

  • A Bachelor’s Degree, preferably in Computer Science, Software Engineering, Information Technology, or related disciplines.
  • Deep understanding of system design, data structures and algorithms, data modelling, data access, and data storage.
  • Proficiency in writing SQL for databases such as Postgres, MSSQL.
  • Demonstrated ability in using cloud technologies such as AWS, Azure, and Google Cloud.
  • Experience with orchestration frameworks such as Airflow, Azure Data Factory.
  • Experience with distributed data technologies such as Spark, Hadoop.
  • Proficiency in programming languages such as Python, Java, or Scala.
  • Familiarity with building and using CI/CD pipelines.
  • Familiarity with DevOps tools such as Docker, Git, Terraform.

Preferred requirements

  • Experience in architecting data and IT systems.
  • Experience in designing, building, and maintaining batch and real-time data pipelines.
  • Experience with Databricks.
  • Experience with implementing technical processes to enforce data security, data quality, and data governance.
  • Familiarity with government systems and government policies relating to data governance, data management, data infrastructure, and data security.
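
To make the orchestration bullet (Airflow, Azure Data Factory) concrete, here is a minimal Airflow DAG sketch, assuming Airflow 2.4+ where the schedule argument is available; the DAG id and the extract/transform callables are hypothetical placeholders.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw data from the source system")

def transform():
    print("clean and load the curated table")

# One DAG with two dependent tasks, run once a day, no backfill of missed runs.
with DAG(
    dag_id="daily_curation",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task  # extract must finish before transform starts
```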

4. Grab

Senior Data Engineer

  • Bachelor's degree in Analytics, Data Science, Mathematics, Computer Science, Information Systems, Computer Engineering, or a related technical field
  • At least 3-4 years of experience developing data warehouse and business intelligence solutions
  • Sound knowledge of data warehousing concepts, data modeling/architecture and SQL
  • Knowledge of programming languages such as Java, Scala, Python, etc.
  • Understanding of performance, scalability and reliability concepts
  • Experience with Big Data frameworks such as Hadoop, Spark, etc.
  • Experience with developing data solutions on AWS
  • Ability to drive initiatives and work independently, while being a team player who can liaise with various stakeholders across the organization
  • Excellent written and verbal communication skills in English
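
A small boto3 sketch for the "data solutions on AWS" bullet: writing a date-partitioned extract to S3 under Hive-style dt= prefixes. The bucket and key names are hypothetical placeholders.

```python
import boto3

def upload_partition(local_path: str, ds: str) -> None:
    """Upload one day's extract under a Hive-style dt= partition prefix."""
    s3 = boto3.client("s3")
    key = f"warehouse/orders/dt={ds}/part-0000.parquet"
    s3.upload_file(local_path, "example-data-lake", key)

upload_partition("/tmp/orders_2024-01-01.parquet", "2024-01-01")
```

Laying the lake out by date keeps downstream warehouse loads and query engines scanning only the days they actually need.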

5. Stripe

6. Bytedance

Data Engineer - Global Payments

  • Bachelor’s degree or above in Computer Science, Statistics, Mathematics or other related majors;
  • At least 3 years of experience;
  • Proficient in at least one programming language such as Python, Java, Scala, Go, etc., with a strong engineering background and interest in data;
  • Prior experience with writing and debugging data pipelines using a distributed framework (Hadoop/Spark/Flink);
  • Familiar with OLAP engines (Hive/ES/Clickhouse/Druid/Kylin/Doris etc.);
  • Familiar with data warehouse architecture, data modelling methods and data governance; enthusiastic about data mining, strong business understanding and abstraction capabilities;
  • Proficient in databases, strong SQL/ETL development ability;
  • Experience in real-time data warehouse development is preferred.
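
For the distributed-framework and real-time warehouse bullets, here is a minimal Spark Structured Streaming sketch that counts payment events per minute; the topic name, broker address, and JSON field are hypothetical, and the job assumes the spark-sql-kafka connector is on the classpath.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("payments-stream").getOrCreate()

# Read raw payment events from Kafka.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "payment_events")
    .load()
)

# Count events per one-minute window per currency.
counts = (
    events.select(
        F.get_json_object(F.col("value").cast("string"), "$.currency").alias("currency"),
        F.col("timestamp"),
    )
    .groupBy(F.window("timestamp", "1 minute"), "currency")
    .count()
)

# Console sink for the sketch; a real job would write to an OLAP store instead.
query = counts.writeStream.outputMode("complete").format("console").start()
query.awaitTermination()
```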

7. HoYoverse

Data Engineer (SDK)

  • Mastery of at least one object-oriented programming language, such as Python/Java/Scala;
  • Good foundation in data structures and algorithms;
  • At least 3 years of experience in big data processing projects;
  • In-depth knowledge in distributed real-time or batch data processing systems;
  • Proficient in SQL with solid SQL tuning experience; understanding of the basic principles and tuning of big data components such as Hadoop/Hive/Spark/Kafka/Flink/Clickhouse;
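
As one concrete example of the SQL-tuning point, here is a Spark SQL sketch that relies on partition pruning; the sdk_logs.events table and its dt partition column are hypothetical placeholders.

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("sdk-events-tuning")
    .enableHiveSupport()
    .getOrCreate()
)

# Filtering on the partition column (dt) lets the optimizer prune partitions,
# so only one day's files are scanned; .explain() shows the pruned scan.
spark.sql("""
    SELECT event_name, COUNT(*) AS cnt
    FROM sdk_logs.events
    WHERE dt = '2024-01-01'
    GROUP BY event_name
""").explain()
```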

8. Airwallex

Senior Data Engineer

  • Bachelor’s or Master’s degree in CS/CE/CIS, (or equivalent experience) with knowledge of Kotlin / Java / Scala / Python / SQL. Knowledge of Spring Boot, Spark, Flink, Hadoop, BigQuery, and Snowflake is preferred.
  • Ability to take ownership of designing, building, and operating distributed systems and establishing overarching data architecture.
  • Strong working knowledge of Real-time/Batch processing systems.
  • Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Strong analytic skills related to working with unstructured datasets.
  • Build processes supporting data transformation, data structures, metadata, dependency and workload management.
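
For the unstructured-data and transformation bullets, a tiny Python sketch that flattens semi-structured JSON payment events into tabular records; all field names here are hypothetical placeholders.

```python
import json

def flatten_event(raw: str) -> dict:
    """Pull the fields downstream models need out of a nested JSON event."""
    event = json.loads(raw)
    payment = event.get("payment", {})
    card = event.get("card", {})
    return {
        "event_id": event["id"],
        "currency": payment.get("currency"),
        "amount": payment.get("amount"),
        "issuing_country": card.get("issuing_country"),
    }

sample = '{"id": "evt_1", "payment": {"currency": "USD", "amount": 120}, "card": {"issuing_country": "SG"}}'
print(flatten_event(sample))  # {'event_id': 'evt_1', 'currency': 'USD', 'amount': 120, 'issuing_country': 'SG'}
```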

9. Meta

10. GIC

VP, Data Engineer (Private Market Solutions), Technology Group 15441

  • 4+ years of relevant experience in data engineering or backend development, and hands-on experience in solution design, software testing, and production support
  • Experience or knowledge in one or more of the following technologies is advantageous:
      • Database & Big Data Platforms – Oracle, MS SQL, Snowflake, JDBC/ODBC
      • Programming and Scripting – Python, Java, REST API
      • AWS services – S3, Airflow, Glue, SQS, SNS
      • React.js and other JavaScript frameworks/libraries
  • Experience with Agile software development methodologies and practices such as Scrum, Kanban and Test-Driven Development
  • Familiarity with Private Markets data is desirable
  • Keen learner, independent problem solver with strong communication and interpersonal skills
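
A minimal boto3 sketch for the SQS part of the AWS services line: publishing a file-arrival message and draining the queue. The queue URL, bucket, and key are hypothetical placeholders.

```python
import json

import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.ap-southeast-1.amazonaws.com/123456789012/pm-file-events"

# Producer side: announce that a new valuation file landed in S3.
sqs.send_message(
    QueueUrl=QUEUE_URL,
    MessageBody=json.dumps({"bucket": "pm-landing", "key": "valuations/2024-01-01.csv"}),
)

# Consumer side: poll, process, then delete each message.
resp = sqs.receive_message(QueueUrl=QUEUE_URL, MaxNumberOfMessages=10, WaitTimeSeconds=5)
for msg in resp.get("Messages", []):
    body = json.loads(msg["Body"])
    print("would trigger a load for", body["key"])
    sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
```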
