• Content & News, Short Video / Series D or later / 2,000+ employees
    Job Responsibilities:
    Be part of the global monetization team and contribute to overseas business growth. You will get a full view of monetization products and business operations, from ad products to client and sales relationships, and participate in building data assets that drive business value.
    1. Streamline data workflows in business operations for the overseas monetization business, and deliver data products and framework-level data insights;
    2. Define best practices based on a deep understanding of product details and business workflows and on research into comparable industry products, and drive them to implementation in a practical, innovative way;
    3. Run the data workflows hands-on, collaborate with the R&D team, and produce high-quality product prototype documents;
    4. Align closely with stakeholders, understand the pain points of the business, balance short-term and long-term goals, and build a clear, well-agreed roadmap: support the business with short-term fixes and strategy while driving long-term product iteration, development and release.
    Job Requirements:
    1. Bachelor's degree or above, with 3+ years of product management experience, including 2+ years in data products;
    2. Strong SQL skills for querying and debugging raw data;
    3. Familiar with data analysis frameworks; able to break scenario requirements down into a logically clear data product system;
    4. Able to independently identify problems and propose solutions;
    5. Able to produce high-quality PRDs and prototypes;
    6. Fluent spoken and written English;
    7. Experience in data analysis, data science, data engineering, or statistical analysis is a plus.
  • Content & News, Short Video / Series D or later / 2,000+ employees
    Job Responsibilities:
    We are looking for generalists and specialists in AI/ML techniques, including computer vision (CV), natural language processing (NLP), and audio signal processing. You will be responsible for partnering with a variety of stakeholders (product, operations, policy, and engineering) and developing state-of-the-art models.
    1. Design and build core product capabilities by leveraging content understanding capabilities such as natural language processing, machine learning, or computer vision to extract insights and improve monetization strategies;
    2. Develop creative solutions and build prototypes for business problems using algorithms based on the latest deep learning, machine learning, statistics, and optimization techniques;
    3. Independently manage data projects from 0 to 1, and collaborate with product managers to define user stories and success metrics that guide the development process;
    4. Verify the business value and estimated revenue of projects using methods such as A/B testing (see the illustrative sketch after this posting);
    5. Collaborate with engineering teams to deploy and scale data science solutions.
    Job Requirements:
    1. Knowledge of the underlying mathematical fundamentals of statistics, machine learning, and analytics;
    2. At least 3 years of experience in data modeling/analysis, with industry experience in ML/DL and one of CV/NLP/Speech;
    3. Experience with exploratory data analysis, statistical analysis, hypothesis testing, and model development;
    4. Fluency in SQL, Hive, Presto, or Spark, with experience working with large datasets;
    5. High proficiency in Python and SQL, and in ML/DL frameworks such as TensorFlow and PyTorch;
    6. Clear understanding of data pipelines, model development, model testing, and deployment;
    7. Experience with CI/CD (e.g., Git) and cloud services (e.g., AWS/GCP/Azure) is highly desirable;
    8. Strong English communication skills; able to explain analytical and technical content in an easy-to-understand way to both technical and non-technical audiences;
    9. Intellectual curiosity, along with excellent problem-solving and quantitative skills, including the ability to break down issues, identify root causes, and recommend solutions.
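    A minimal sketch of the kind of A/B readout the responsibilities above mention, assuming a simple conversion-rate metric; the group sizes and conversion counts below are invented purely for illustration:

# Hedged sketch: a two-proportion z-test for an A/B test on conversion rate.
# All numbers are made up; a real experiment would run on an experimentation platform.
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Return (absolute lift, z statistic, two-sided p-value) for groups A and B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled rate under H0: no difference
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p_b - p_a, z, p_value

if __name__ == "__main__":
    lift, z, p = two_proportion_z_test(conv_a=1_180, n_a=52_000, conv_b=1_310, n_b=51_500)
    print(f"absolute lift={lift:.4%}, z={z:.2f}, p={p:.3f}")

    In practice a readout like this would sit alongside guardrail metrics and revenue estimates rather than a single significance test.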
  • 17k-30k · 13-month salary / 1-3 years of experience / Bachelor's degree
    Travel & Mobility / Series D or later / 500-2,000 employees
    Job Responsibilities
    - Communicate with the business to understand its scenarios and build a business analysis metric system based on business needs;
    - Design visual monitoring dashboards and data reports (e.g., weekly/monthly reports) to improve the efficiency of data insights and support business growth;
    - Own day-to-day business analysis, including breaking down and diagnosing metrics such as user behavior, conversion rate, orders, and revenue (see the illustrative sketch after this posting);
    - Regularly deliver dedicated analyses and strategy iteration recommendations, such as conversion-rate analysis and product quality analysis, to support business goals;
    - Monitor data quality (e.g., validating event-tracking data) and coordinate with product and engineering teams to fix data issues.
    Job Requirements
    - Bachelor's degree or above; statistics, mathematics, computer science, economics, or business majors preferred;
    - 2-4 years of experience in data analysis or business analysis (internet or cross-border e-commerce background preferred);
    - Proficient in SQL for data extraction, with basic data processing skills in Python;
    - Skilled in building dashboards with visualization tools such as Tableau; project experience using AI to improve report-building efficiency or SQL quality is a plus;
    - Strong stakeholder management skills; a good communicator with a strong sense of ownership and coordination ability.
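    A minimal sketch of the kind of metric breakdown the day-to-day analysis above describes, assuming a simple per-user event table; the column names and the demo data are hypothetical:

# Hedged sketch: conversion-rate breakdown by channel and day with pandas.
import pandas as pd

def conversion_by_segment(events: pd.DataFrame) -> pd.DataFrame:
    """events has one row per user-day with columns: date, channel, visited, ordered."""
    grouped = events.groupby(["date", "channel"], as_index=False).agg(
        visitors=("visited", "sum"),
        orders=("ordered", "sum"),
    )
    grouped["conversion_rate"] = grouped["orders"] / grouped["visitors"]
    return grouped.sort_values(["date", "conversion_rate"], ascending=[True, False])

if __name__ == "__main__":
    demo = pd.DataFrame({
        "date": ["2024-05-01"] * 4,
        "channel": ["search", "social", "search", "direct"],
        "visited": [1, 1, 1, 1],
        "ordered": [1, 0, 0, 1],
    })
    print(conversion_by_segment(demo))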
  • 25k-40k / 10+ years of experience / Bachelor's degree
    Education / No funding required / 500-2,000 employees
    Job Responsibilities:
    Job Title: Head (Data Intelligence Department)
    Department: Data Intelligence Department, HKUST(GZ)
    As a higher education institution committed to innovation and data-driven decision-making, HKUST(GZ) is establishing a new Data Intelligence Department aimed at harnessing the power of data to enhance operational efficiency and strategic growth across all levels of the organization. We are seeking a visionary leader to head this critical initiative. The ideal candidate will possess a blend of strategic vision, technical expertise, and leadership skills to build and nurture a high-performing team dedicated to data governance, data analytics, and business intelligence.
    Duties
    1. Drive cooperation among departments to develop and implement a university-wide data governance strategy that aligns with the University's mission and strategic goals.
    2. Establish the data governance framework, develop long-term plans for the AI/HPC Data Center and Smart Campus, and promote the use of computing technologies and improvements in energy efficiency to achieve the vision of a "Sustainable Smart Campus".
    3. Develop and drive the implementation of campus data governance policies, including standards for the allocation and usage of computing resources and smart campus services, policies for full data lifecycle management, and category- and class-based data security and protection.
    4. Lead innovation in the management and services of the AI/HPC Data Center to meet the computational demands of research, teaching and future development, and enhance the University's research capabilities by improving operational efficiency and service capabilities.
    5. Oversee the operation and intelligent upgrades of Smart Campus systems, including multimedia and research/teaching facilities, office and life services, security access, and smart buildings, ensuring the systems operate and interact efficiently and upgrading the design of smart campus services.
    6. Supervise data compliance to ensure it meets domestic and international data protection laws and standards, ethical norms, and the University's data confidentiality requirements.
    7. Guide the data teams of the University's departments to establish and optimize data processing workflows, foster a culture of data-driven decision-making, promote data accumulation to form data assets, and drive continuous institutional improvement through the strategic use of data across the University.
    Job Requirements:
    A qualified candidate should:
    1. Hold a bachelor's degree or above in Data Science, Computer Science, or a related field.
    2. Have at least 10 years of relevant work experience, including data management and team leadership in large organizations.
    3. Possess excellent communication and teamwork skills, and be able to collaborate across departments to promote the implementation of data policies.
    4. Be familiar with domestic and international data security and compliance policies and regulations, such as China's Data Security Law and Cybersecurity Law, with experience handling data compliance across multiple jurisdictions.
    5. Have strong strategic planning, organizational management, innovation management, and change implementation capabilities.
    This is a Mainland appointment, and the appointee will be offered a contract by the HKUST(GZ) entity in accordance with Mainland labor laws and regulations.
Starting salary will be commensurate with qualifications and experience.
  • 35k-45k · 13-month salary / 3-5 years of experience / PhD
    Data Services, Software Development / No funding required / 150-500 employees
    Alvanon HK, Ltd.
    As the leading global apparel business and product development consultancy, we help our clients grow sales and improve profitability by aligning their internal teams and strategic suppliers, engaging consumers, and implementing world-class innovations throughout their product development processes and supply chains. We sincerely invite analytical, energetic and self-motivated individuals to join Alvanon. We are looking for candidates for the following position: Machine Learning/Data Science Manager.
    You will be working at our head office in Hong Kong, developing new products relating to body fit and fashion technology using engineering and machine learning techniques. You will be leading the machine learning team, working closely with other stakeholders to create innovative solutions and products with the latest machine learning technologies.
    Responsibilities
    ● Lead and drive the team to deliver AI/ML solutions.
    ● Identify and communicate with key stakeholders and understand their requirements.
    ● Research and develop AI/ML solutions that have real impact on our business/products.
    ● Implement machine learning/deep learning models and bring them into production.
    ● Work with data scientists and software developers through the entire R&D pipeline (problem analysis, idea generation, data collection, model prototyping, evaluation, implementation, deployment and maintenance).
    ● Maintain and improve the in-house interface for application prototypes, model visualization and evaluation.
    Requirements
    ● Doctoral degree in Computer Science/Information Systems/Statistics or a related quantitative discipline
    ● A minimum of 5 years of working experience in the relevant field
    ● Understanding of common machine learning models/algorithms, such as SVM, logistic regression, Bayesian regression, neural networks, etc.
    ● Basic understanding of probability, statistics, optimization, data structures and algorithms
    ● Hands-on experience with machine learning/deep learning tools such as PyTorch, TensorFlow, Keras
    ● Strong communication skills in English, Mandarin and Cantonese
    ● Research experience in AI/ML/CV/CG in companies or university research labs is a plus
    Personality
    ● Effective communication skills
    ● A sense of ownership of work and the desire to deliver great products
    To apply, please
  • 25k-40k / 5-10 years of experience / Bachelor's degree
    Travel & Mobility / Series D or later / 500-2,000 employees
    We are seeking a data professional with combined expertise in data analysis and data warehousing to join our dynamic team. The ideal candidate will focus on leveraging data analysis to drive overall data initiatives. This role will be responsible for designing and optimizing data models, handling complex datasets, and improving data quality and processing efficiency. We expect the candidate to work independently while also collaborating closely with the team as needed to drive data-driven business growth.
    Key Responsibilities:
    - Design and maintain data warehouse architecture to support various business lines, including but not limited to Attractions, Mobility, and Hotels.
    - Develop a deep understanding of each business line and use data analysis to support business decisions and strategy development.
    - Build and optimize data models to ensure data accuracy and reliability.
    - Independently handle and optimize complex datasets, enhancing data quality and processing workflows.
    - Collaborate closely with both business and technical teams to ensure data solutions meet business requirements.
    - Write technical documentation and maintain the data warehouse's data dictionary.
    Qualifications:
    - Over 5 years of experience in the data warehouse field.
    - Proficient in SQL and database technologies, with hands-on experience managing large-scale databases.
    - Strong experience in data model construction, with the ability to independently design and optimize complex data models.
    - Extensive experience in data quality and underlying data processing, with the ability to effectively resolve data issues.
    - Familiarity with DBT, with practical experience preferred.
    - Strong analytical thinking and problem-solving skills, with the ability to complete projects independently.
    - Excellent communication and teamwork skills, capable of effectively interacting with team members from diverse backgrounds.
  • 35k-50k / 5-10 years of experience / Bachelor's degree
    Other / No funding required / 2,000+ employees
    Do you have experience in architecting data at the scale of a global organization? Do you enjoy working on cutting-edge technologies while supporting end-users in achieving their business goals?
    About QIMA
    You will be linking our new companies in the Americas, our teams in Europe (mostly France) and our teams in Asia (China, Hong Kong, Philippines).
    Your role
    QIMA has a 35%/year growth pace, 20% of which comes from acquisitions. It is paramount that we integrate newly acquired companies quickly so that they can extend the benefit of our state-of-the-art data management & dashboards to our new clients and colleagues. Data integration plays a key role in this: how do we quickly and unambiguously understand the data of a newly acquired company? How do we connect this data to our existing data flows?
    Data plays a key role at QIMA, as we master the entire data flow, from data collection (with our own inspectors, auditors, and labs), to data processing (BI and data scientists) and data actionability (insights for our customers and for our managers). Data is spread across different departments and areas of expertise inside the company: Marketing, Operations, Sales, IT, and more. Data governance is key, and collaboration around data will unlock the potential to bring even more value to our customers about the quality of their products, and to our managers about their operations. These challenges lead us to look for our Head of Data.
    In this role, your main responsibilities will include, but not be limited to:
    - Project Management
    o Define the business cases of data projects together with stakeholders, and deliver them
    o Lead the transversal projects around our data warehouse and cloud ETL
    o Lead the Master Data Management projects, leveraging the key skills and technologies already in place across departments
    o Drive the data integration of newly acquired companies within the QIMA group in order to synchronize reporting dashboards and provide a transversal understanding of the business
    o Lead discussion and integration projects with external partners
    o Track results and drive continuous improvement
    o Be responsible for the budget, roadmap, quality and delivery of these projects
    - People Management
    o Manage the Data Engineering and Business Intelligence teams
    - Community Animation
    o Animate data governance across domains and departments
    o Be the guardian of data quality in the group, challenge data inconsistencies and ensure that data is shared by all departments in all circumstances
    o Implement knowledge-sharing practices inside the data community
    o Be responsible for data lineage and data quality
    - Management of the run
    o Cooperate with our IT support organization to set up support for the newly created systems
    o Organize and manage the day-to-day operation and support of these systems
    Requirements:
    In order to succeed in this role, you must:
    - Hold a Master's degree in computer science
    - Have extensive experience and knowledge of data solution architecting
    - Have experience with transversal projects
    - Be a hands-on person, autonomous, at ease discussing with a CEO or a field operator
    - Be open-minded, agile with change and pragmatic
    - Be able to drive a workstream and train final users
    - Be able to work in a multinational environment and on multiple simultaneous projects
    - Have strong communication skills, both oral and written
    - Have excellent teamwork and interpersonal skills
    - Be fluent in English: daily use is required with our colleagues all over the world
    - If you are based in Europe, be willing and able to travel.
    We offer: a competitive package, performance bonus, fast career progression and international career opportunities.
  • 45k-65k / 5-10 years of experience / Bachelor's degree
    Other / No funding required / 2,000+ employees
    Your role
    QIMA has a 35%/year growth pace, 20% of which comes from acquisitions. It is paramount that we integrate newly acquired companies quickly so that they can extend the benefit of our state-of-the-art data management & dashboards to our new clients and colleagues. Data integration plays a key role in this: how do we quickly and unambiguously understand the data of a newly acquired company? How do we connect this data to our existing data flows?
    Data plays a key role at QIMA, as we master the entire data flow, from data collection (with our own inspectors, auditors, and labs), to data processing (BI and data scientists) and data actionability (insights for our customers and for our managers). Data is spread across different departments and areas of expertise inside the company: Marketing, Operations, Sales, IT, and more. Data governance is key, and collaboration around data will unlock the potential to bring even more value to our customers about the quality of their products, and to our managers about their operations. These challenges lead us to look for our Head of Data.
    In this role, your main responsibilities will include, but not be limited to:
    - Project Management
    o Define the business cases of data projects together with stakeholders, and deliver them
    o Lead the transversal projects around our data warehouse and cloud ETL
    o Lead the Master Data Management projects, leveraging the key skills and technologies already in place across departments
    o Drive the data integration of newly acquired companies within the QIMA group in order to synchronize reporting dashboards and provide a transversal understanding of the business
    o Lead discussion and integration projects with external partners
    o Track results and drive continuous improvement
    o Be responsible for the budget, roadmap, quality and delivery of these projects
    - People Management
    o Manage the Data Engineering and Business Intelligence teams
    - Community Animation
    o Animate data governance across domains and departments
    o Be the guardian of data quality in the group, challenge data inconsistencies and ensure that data is shared by all departments in all circumstances
    o Implement knowledge-sharing practices inside the data community
    o Be responsible for data lineage and data quality
    - Management of the run
    o Cooperate with our IT support organization to set up support for the newly created systems
    o Organize and manage the day-to-day operation and support of these systems
    Requirements:
    In order to succeed in this role, you must:
    - Hold a Master's degree in computer science
    - Have extensive experience and knowledge of data solution architecting
    - Have experience with transversal projects
    - Be a hands-on person, autonomous, at ease discussing with a CEO or a field operator
    - Be open-minded, agile with change and pragmatic
    - Be able to drive a workstream and train final users
    - Be able to work in a multinational environment and on multiple simultaneous projects
    - Have strong communication skills, both oral and written
    - Have excellent teamwork and interpersonal skills
    - Be fluent in English: daily use is required with our colleagues all over the world
    - If you are based in Europe, be willing and able to travel.
  • 25k-40k / 5-10 years of experience / Bachelor's degree
    Consumer Lifestyle / No funding required / 2,000+ employees
    Job Responsibilities
    1. Perform an assessment of all visualization and reporting requirements and develop a long-term strategy for the various dashboard & reporting solutions.
    2. Communicate effectively with business partners to understand their needs, and design reports accordingly.
    3. Collect and understand the business logic behind all reports and translate it into data model design requirements.
    4. Manage projects, prepare updates and implement all phases of a project.
    5. Turn data into insights with actionable execution plans and influence key stakeholders to implement the solutions.
    6. Provide training to business teams on BI tool usage and dashboard creation.
    Job Requirements
    1. Proficient in using SQL, Python and R for data manipulation and analysis.
    2. Experienced in data visualization and dashboard development with tools like Tableau.
    3. Excellent presentation, project management and people management skills.
    4. Bachelor's degree or above in statistics, business analytics, mathematics, computer science or a relevant field, with training in data and analytics.
    5. 5+ years of experience managing analytics & BI projects as a senior analyst, with successful project experience in using data to drive business value.
    6. Experience in data governance, data quality management, data processing and insights.
    7. Strong in data visualization tools (such as Tableau, Power BI, etc.) as well as BI portal products.
    8. Excellent project planning and organization skills; able to use data to identify and solve problems.
    9. Experience in retail, CRM, supply chain and production is a plus.
  • 25k-35k · 14-month salary / 5-10 years of experience / Master's degree
    IT Services | Consulting / Listed company / 2,000+ employees
    Job Description:
    - Develop AI and machine learning solutions to optimise business performance across different areas of the organisation (see the illustrative sketch after this posting);
    - Build exploratory analyses to identify the root causes of a particular business problem;
    - Explore and analyze all the data to understand customer behavior and provide a better customer experience;
    - Use data analytics to help in-country Marketing, Distribution, Operations, Compliance and HR teams increase revenue, lower costs and improve operational efficiency;
    - Identify actionable insights, suggest recommendations, and influence the direction of the business across cross-functional groups;
    - Build MIS to track the performance of implemented models, validate their effectiveness and show the financial benefits to the organization;
    - Support the data analytics team on data extraction and manipulation;
    - Help the decision analytics team create documentation for all analytical solutions developed;
    - Build the data infrastructure necessary for the development of predictive models and more sophisticated data analysis;
    - Develop data analyses for different business units.
    Requirements:
    - A background in a numerate discipline, e.g. Actuarial Science, Mathematics, Statistics, Engineering, or Computer Science with a strong computer programming component;
    - Demonstrable understanding of data quality risks and the ability to carry out the quality checks necessary to validate the results obtained;
    - Knowledge of a variety of machine learning techniques (clustering, decision trees, random forests, artificial neural networks, etc.);
    - Proven hands-on experience with at least one advanced data analysis platform (e.g. Python, R);
    - Sound knowledge of programming in SQL, or other programming experience;
    - Articulate, with excellent oral and written communication skills; a fast learner with a willing attitude; intellectually rigorous, with strong analytical skills and a passion for data.
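    A minimal sketch of one of the machine learning techniques named above (a random forest classifier) with a simple hold-out quality check, assuming scikit-learn is available; the features and target below are synthetic and purely illustrative:

# Hedged sketch: random forest on a made-up churn-style problem with a hold-out AUC check.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 5_000
X = np.column_stack([
    rng.normal(40, 12, n),        # hypothetical: customer age
    rng.exponential(300, n),      # hypothetical: annual premium
    rng.integers(0, 5, n),        # hypothetical: number of products held
])
# Synthetic target loosely tied to the features, just so the example runs end to end.
y = (0.02 * X[:, 0] + 0.001 * X[:, 1] - 0.3 * X[:, 2] + rng.normal(0, 1, n) > 1.0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("hold-out AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))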
  • Software Services | Consulting, IT Services | Consulting / No funding required / 50-150 employees
    Foreign bank; two-day weekends; no overtime. There are actually five positions:
    - 4-9 years: data analysis, deep learning, NLP, transformer models
    - 3-6 years: data engineering, Hadoop, Spark, Kafka, SQL, Flink
    - 3-6 years: data engineering, Python, SQL, pandas, databases
    - 2-6 years: Machine Learning Engineer, Python, Docker, CI/CD; machine learning and/or ML model monitoring experience a plus
    - 6-9 years: experience in HDFS, MapReduce, Hive, Impala, Sqoop, Linux/Unix technologies; Spark is an added advantage
    Job Duties & Responsibilities:
    • Support finance regulatory reporting projects & applications as an ETL developer | Big Data applications following the Agile software development life cycle | level 2/3 support of the data warehouse.
  • 25k-35k · 13-month salary / 5-10 years of experience / Bachelor's degree
    Consumer Lifestyle / No funding required / 2,000+ employees
    1. Who you are
    As a person you are motivated to continuously develop and enhance your programming skills and your knowledge of machine learning and AI applications. You are passionate about the IKEA business, our values, and how they apply to the data management process. Furthermore, you are energized by working both independently and interdependently with the architecture network and cross-functional teams, and you appreciate the mix of strategic thinking and turning architecture trends into practice. Last but not least, you share and live the IKEA culture and values.
    In this role you have proven advanced training in (computer) engineering, computer science, econometrics, mathematics or equivalent. You have experience and knowledge of working with large datasets and distributed computing architectures. You have knowledge of coding (R and/or Python) and of developing artificial intelligence applications. You have experience in statistical analysis and statistical software, and you are confident in data processing and analysis. You have demonstrable experience of working in an Agile or DevOps set-up.
    You have knowledge in the following areas:
    • Data set processes for data modelling, mining and production, as well as visualization tools, e.g. Tableau
    • At least one building block of artificial intelligence
    • Machine learning development languages (R and/or Python) and other developments in the fast-moving data technology landscape, e.g. Hive or Hadoop
    • DevOps and agile development practices
    • Exposure to the development of industrialised analytical software products
    • IKEA's corporate identity, core values and vision of creating a better everyday life for the many people
    We believe that you are able to work with large amounts of raw data and feel comfortable working with different programming languages. You have high intellectual curiosity, with the ability to develop new knowledge and skills and to use new concepts, methods, digital systems and processes to improve performance; the ability to provide input to user-story prioritisation as appropriate based on new ideas, approaches, and strategies; and the ability to understand the complexity of the IKEA business and the role of data and information as an integrated part of the business.
    2. Your responsibilities
    As Data Scientist you will perform predictive and prescriptive modelling and help develop and deploy artificial intelligence and machine learning algorithms to reinvent and grow the IKEA business in a fast-changing digital world.
    You will:
    • Explore and examine data for use in predictive and prescriptive modelling to deliver insights to business stakeholders and support them in making better decisions
    • Influence product and strategy decision-making by helping to visualize complex data, and enable change through knowledge of the business drivers that make the product successful
    • Support senior colleagues with modelling projects, identifying requirements and building tools that are statistically grounded, explainable and able to adapt to changing attributes
    • Support the development and deployment of the different analytical building blocks that support the requirements of the capability area and customer needs
    • Use root-cause research to identify process breakdowns and provide data, using various skill sets, to find solutions to the breakdowns
    • Work closely with other data scientists and across functions to help produce all required design specifications and ensure that data solutions work together and fulfil business needs
    • Work across initiatives within INGKA Group, steering towards data-driven solutions
  • 40k-60k / 5-10 years of experience / Bachelor's degree
    Automotive | Mobility / Listed company / 2,000+ employees
    The Role
    We are looking for a Data Engineer to be part of our Applications Engineering team. This person will design, develop, maintain and support our Enterprise Data Warehouse & BI platform within Tesla using various data & BI tools. This position offers a unique opportunity to make a significant impact across the entire organization by developing data tools and driving a data-driven culture.
    Responsibilities:
    • Work in a time-constrained environment to analyze, design, develop and deliver Enterprise Data Warehouse solutions for Tesla's Sales, Delivery and Logistics teams.
    • Set up, maintain and optimize the big data platform for production usage in reporting, analysis and ML applications.
    • Establish scalable, efficient, automated processes for data analyses, model development, validation and implementation.
    • Create ETL pipelines using Spark/Flink.
    • Create real-time data streaming and processing using open-source technologies such as Kafka and Spark (see the illustrative sketch after this posting).
    • Develop collaborative relationships with key business sponsors and IT resources for the efficient resolution of work requests.
    • Provide timely and accurate estimates for newly proposed functionality enhancements and critical situations.
    • Develop, enforce, and recommend enhancements to applications in the areas of standards, methodologies, compliance, and quality assurance practices; participate in design and code walkthroughs.
    Qualifications:
    Minimum Qualifications:
    • Proficient experience in building large-scale Spark batch applications.
    • Strong experience in Data Warehouse ETL design and development, methodologies, tools, processes and best practices.
    • Strong experience creating stellar dashboards and reports for C-level executives.
    • Good understanding of "one module, one ID and one service".
    Preferred Qualifications:
    • 3+ years of development experience in open-source technologies such as Scala and Java.
    • Excellent experience with Hortonworks/Cloudera platforms.
    • Practical experience using HDFS.
    • Relevant working experience with Docker and Kubernetes preferred.
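    A minimal sketch of the kind of real-time pipeline described above, assuming PySpark Structured Streaming with a Kafka source (and the Kafka connector package on the classpath); the broker address, topic name and event schema are hypothetical:

# Hedged sketch: hourly order counts per region from a Kafka stream with PySpark.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("orders-stream-demo").getOrCreate()

schema = StructType([
    StructField("order_id", StringType()),
    StructField("region", StringType()),
    StructField("event_time", TimestampType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")   # hypothetical broker
       .option("subscribe", "orders")                       # hypothetical topic
       .load())

orders = raw.select(F.from_json(F.col("value").cast("string"), schema).alias("o")).select("o.*")

# Hourly order counts per region, tolerating 10 minutes of late-arriving data.
counts = (orders
          .withWatermark("event_time", "10 minutes")
          .groupBy(F.window("event_time", "1 hour"), "region")
          .count())

query = (counts.writeStream
         .outputMode("update")
         .format("console")       # a real pipeline would write to a warehouse table instead
         .start())
query.awaitTermination()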
  • 30k-35k · 15-month salary / 3-5 years of experience / Bachelor's degree
    Automotive | Mobility / No funding raised / 500-2,000 employees
    BMW is focusing on the development of automated driving (AD) with great effort. In order to develop and test highly intelligent software for environmental perception, maneuver planning and acting, which finally enables the self-driving capability of an autonomous car, a huge amount of environmental sensor data needs to be collected, hosted and processed in an AD Development Platform (AD Data Center). The processing covers, e.g., broad data analytics, reprocessing and KPI evaluation, simulation, virtual endurance runs, etc. The BMW AD team in China is fully integrated into the global network of highly educated experts from different domains. In this job you will play a crucial role at the interfaces between BMW AD software development and IT teams as well as Chinese internet tech companies and high-tech AD suppliers. Big data, distributed data and RaaS (Reprocessing as a Service) technologies (e.g. MapReduce, HDFS, Airflow, Jenkins, CI/CD, etc.) will be your core focus. Your prime target will be to accelerate AD product development by analysing the FTs' requirements, designing big data architectures, tasking the development of related big data applications and managing their deployment to a massive off-site data center BMW has just developed in cooperation with a big Chinese tech player. In order to achieve this target you will be exposed to a complex project environment involving both BMW internal and external partners. Technologies you will be working with include Docker, OpenShift, Kubernetes, Grafana, Spark, etc.
    Major Responsibilities:
    - Design, deployment and improvement of distributed big data systems targeting automated driving applications;
    - Tasking the development and improvement of off-board backend big data applications and related communication protocols targeting automated driving use cases;
    - Management and structuring of requirements provided by BMW's AD development teams regarding the big data AD platform;
    - Steering of BMW China AD development teams and external cooperation partners (a big Chinese tech player) regarding data-driven development;
    - Review, monitor and report the status of your own project;
    - Support the budget planning process for the organizational unit;
    - Research, recommend, design and develop our big data architecture; ensure system, technical and product architectures are aligned with the China data platform strategy.
    Qualifications:
    - Master's/Bachelor's degree in Computer Science, Electrical/Electronic Engineering, Information Technology or another related field, or equivalent;
    - Good communication skills and good English language skills; German language skills optional and appreciated;
    - Rich experience with big data processing applications in the Hadoop/Spark ecosystem; Hadoop, Hive, Spark and Kafka preferred;
    - Solid programming skills in, e.g., Java, Scala or Python;
    - Rich experience with Docker and Kubernetes;
    - Familiar with CI/CD tools such as Jenkins and Ansible;
    - Substantial knowledge and experience in software development and engineering on (distributed big data) cloud/compute/data lake/data center systems (MapR, OpenShift, Docker, etc.);
    - Experience with scheduling tools is a plus; Airflow and Oozie preferred (see the illustrative sketch after this posting);
    - Experience with AD data analysis, e.g. lidar, radar, image/video data and CAN bus data;
    - Experience in building and maintaining data quality standards and processes;
    - Strong background working with Linux/UNIX environments;
    - Strong Shell/Perl/Ruby/Python scripting experience;
    - Experience with other NoSQL databases is a plus; Elasticsearch and HBase preferred;
    - Solid communication skills with the ability to communicate sophisticated technical concepts and align on decisions with global and China ***** partners;
    - Passion for staying on top of the latest happenings in the tech world and an attitude to discuss and bring them into play;
    - Strong hands-on experience in data warehouse ETL design and development; able to build scalable and complex ETL pipelines for source data in different formats.
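    A minimal sketch of the kind of scheduled reprocessing workflow the posting describes, assuming a recent Airflow 2.x installation and a Spark job submitted via spark-submit; the DAG id, schedule, paths and commands are hypothetical:

# Hedged sketch: an Airflow DAG that reprocesses drive data and then evaluates KPIs.
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="ad_sensor_reprocessing_demo",
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",        # nightly run (Airflow 2.4+ style "schedule" argument)
    catchup=False,
) as dag:
    reprocess = BashOperator(
        task_id="reprocess_drive_data",
        bash_command="spark-submit --master yarn /opt/jobs/reprocess_drives.py",   # hypothetical job
    )
    evaluate_kpis = BashOperator(
        task_id="evaluate_kpis",
        bash_command="python /opt/jobs/evaluate_kpis.py --date {{ ds }}",          # hypothetical script
    )
    reprocess >> evaluate_kpis                                                     # run KPIs after reprocessing

    A real deployment would more likely use a Spark or Kubernetes operator and data-driven triggers rather than a fixed cron schedule.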
  • 12k-18k / 1-3 years of experience / Associate degree
    Information Security, Data Services / No funding required / 15-50 employees
    Roles & Responsibilities
    - Design, develop, and maintain business intelligence solutions using Tableau.
    - Manage MS SQL databases and develop SQL scripts – views and stored procedures.
    - Work on the SSIS platform for data integration and workflow applications.
    - Create data models to support the design and development of Tableau reports and dashboards.
    - Perform unit testing of reports and dashboards to ensure that they meet the specifications and requirements.
    - Collaborate with cross-functional teams, business analysts and stakeholders to understand the data requirements and design visualizations that provide insights and support decision-making.
    - Document the design, development, and maintenance of reports and dashboards, including creating user manuals and training materials for end-users.
    Job Requirements
    - A minimum of 2 years of relevant experience.
    - Knowledge of SQL & Tableau is a must.
    - Knowledge of MS SQL / SSIS is a must.
    - Knowledge of Excel VBA / Python is a big advantage.
    - Skills in Microsoft Office (Word, PowerPoint and Excel) are essential.
    - Strong verbal and written communication.