Company Description
Jobs for Humanity is dedicated to building an inclusive and just employment ecosystem. Therefore, we have dedicated this job posting to individuals coming from the following communities: Refugee, Neurodivergent, Single Parent, Blind or Low Vision, Ethnic Minority, and the Previously Incarcerated. If you identify with any of these communities, do not hesitate to register, even if you feel that this particular opportunity is not the right fit for you.
Company Name: Binance
Are you looking to be a part of the most influential company in the blockchain industry and contribute to the cryptocurrency revolution that is changing the world?
Binance is looking to expand its data analysis team to research, analyse and deepen our understanding of our data. This provides valuable insight that can be harnessed to guide business decisions across all areas, including compliance, security, product and marketing.
Responsibilities
- Build and maintain the data warehouse, including data access, data modeling, data services, etc.
- Participate in the architecture design, development, release, operation and maintenance of the company's real-time computing platform
- Handle data collection, metadata extraction, data cleaning, data modeling, API development, etc., and build data pipelines
- Own the development iteration and code quality of the data management tool set SDK, reducing the difficulty of data management and increasing the degree of automation in data pipelines
- Carry out data analysis work such as annotation operation analysis, event tracking analysis and data set analysis, and build a unified data warehouse
- Collaborate with the research team to meet data cleaning, data modeling, data analysis and other requirements
Requirements
- Bachelor's degree or above with a major in computer science; 3+ years of working experience is preferred. Level will be graded based on ability and experience
- Solid computer science fundamentals, with a systematic understanding of operating systems, databases, data structures, etc.
- Skilled in more than one programming language, such as Java or Golang, plus Python
- Mastery of the big data technology stack (HDFS, Hive, Elasticsearch, HBase, Impala, Spark/Flink, Kafka, Airflow, Sqoop, etc.), with rich experience in the application and development of big data tools such as Hadoop/HBase/Hive/Flink
- Experience in troubleshooting and tuning; studying component source code is a plus
- Solid SQL skills: an understanding of how SQL executes under different frameworks, familiarity with structured and unstructured big data analysis tools, and rich practical experience
- Rich experience in big data development, including but not limited to data acquisition systems, data cleaning, real-time analysis systems, multi-service data warehouses, etc.
- Strong learning and problem-solving abilities, able to quickly grasp business knowledge and solve technical problems
- Fluent in English
Benefits
- Do something meaningful: be a part of the future of finance technology and the No. 1 company in the industry
- Fast-moving, challenging and unique business problems
- International work environment and flat organisation
- Great career development opportunities in a growing company
- Possibility for relocation and international transfers mid-career
- Competitive salary
- Flexible working hours and casual work attire
By submitting a job application, you confirm that you have read and agree to our Candidate Privacy Notice.