Hadoop hbase nutch jobs

    2,000 hadoop hbase nutch jobs found, prices in USD

    Position: Hadoop Big Data Developer. Type: Remote (screen sharing). Duration: Part-time, Monday to Friday, 4 hours a day. Salary: $700 per month (57,000 INR per month). Start date: ASAP. We are looking for a Hadoop Big Data Developer with experience in Hadoop, Spark, Sqoop, Python, PySpark, Scala, shell scripting, and Linux. The candidate must be able to work in the EST time zone, connecting remotely (i.e. Zoom or Google Meet) on a daily basis to assist in completing the tasks. All work will be done remotely via screen share; no environment setup will be shared.

    $725 (Avg Bid)
    $725 Avg
    21 bids
    FICO Developer (Closed)

    ...important clients. ⬇⬇ Requirements ⬇⬇ Profile of the specialist professionals for the service: - Solid knowledge of the FICO-Blaze/RMA and DMP Streaming tools. - Solid knowledge of IT architecture and systems. - Validation of proofs of concept (POC). - Experience in integration architecture. - Knowledge of Blaze-RMA, DMPS, HBase (intermediate), Hadoop (intermediate), Hive (intermediate) and Kafka. The FICO DMPS and Blaze solution covers the following activities, both corrective and evolutionary: · Resolving questions about FICO tools and other project tools. · Access control in the tools...

    $12 / hr (Avg Bid)
    $12 / hr Avg
    1 bid

    Require help with a college project that involves creating four nodes on a single system, uploading a data set, and performing some basic queries to retrieve information from HDFS.
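
    For the "basic queries" part, a minimal sketch using the Hadoop FileSystem Java API; the NameNode address and dataset path below are placeholders and assume a pseudo-distributed setup is already running.

```java
// Minimal sketch: list a directory and read the first lines of an uploaded dataset.
// fs.defaultFS and the paths are placeholders, not part of the original brief.
import java.io.BufferedReader;
import java.io.InputStreamReader;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsBasics {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://localhost:9000"); // placeholder NameNode address
        FileSystem fs = FileSystem.get(conf);

        // List everything under the user's directory.
        for (FileStatus status : fs.listStatus(new Path("/user/student"))) {
            System.out.println(status.getPath() + "  " + status.getLen() + " bytes");
        }

        // Read the first few lines of the uploaded dataset.
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(fs.open(new Path("/user/student/dataset.csv"))))) {
            for (int i = 0; i < 5; i++) {
                String line = reader.readLine();
                if (line == null) break;
                System.out.println(line);
            }
        }
        fs.close();
    }
}
```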

    $239 (Avg Bid)
    $239 Avg
    5 bids

    A project that recommends movies based on collaborative, content-based and hybrid filtering. Must use Hadoop.
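
    For the collaborative-filtering piece, one possible starting point is Spark's ALS running on a Hadoop cluster with ratings stored in HDFS; a hedged sketch follows. The input path, column names and hyper-parameters are placeholders rather than project requirements, and the content-based and hybrid layers are not shown.

```java
// Sketch: collaborative filtering with Spark ML ALS, reading ratings from HDFS.
import org.apache.spark.ml.recommendation.ALS;
import org.apache.spark.ml.recommendation.ALSModel;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class MovieRecommender {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("MovieRecommender")
                .getOrCreate();

        // ratings.csv is assumed to have the header: userId,movieId,rating
        Dataset<Row> ratings = spark.read()
                .option("header", "true")
                .option("inferSchema", "true")
                .csv("hdfs:///data/ratings.csv"); // placeholder path

        ALS als = new ALS()
                .setUserCol("userId")
                .setItemCol("movieId")
                .setRatingCol("rating")
                .setRank(10)
                .setMaxIter(10)
                .setColdStartStrategy("drop");

        ALSModel model = als.fit(ratings);

        // Top 5 movie recommendations per user; content-based and hybrid
        // filtering would be layered on top of this output.
        model.recommendForAllUsers(5).show(false);

        spark.stop();
    }
}
```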

    $117 (Avg Bid)
    $117 Avg
    3 bids

    We are looking for a Big Data engineer trainer who has real-time experience with Python, SQL, PySpark and Hadoop concepts, and good knowledge of AWS services such as Glue, Athena, Lambda, EMR, S3 and Apache Airflow.

    $638 (Avg Bid)
    $638 Avg
    16 bids
    Big data hadoop -- 3 (Closed)

    For a Big Data Hadoop assignment using Python.

    $23 (Avg Bid)
    $23 Avg
    9 bids
    Big data hadoop (Closed)

    Want someone who can build a Big Data project with Python and Hadoop.

    $196 (Avg Bid)
    $196 Avg
    16 bids

    This is strictly a WFO (work-from-office) job. Only local candidates from Chennai, or those ready to relocate to Chennai, should apply. Duration: 6 months plus. Role 1: Big Data, Hadoop, Spark, Airflow, CI/CD, Python (scripting), DevOps; 3-8 years of experience. Role 2: Data product manager: Tableau, SQL queries, with managerial skills; 5-8 years of experience. Role 3: BI engineer: SQL, SQL Server, ETL, Tableau, data modelling, scripting, agile, Python; 5-8 years of experience. Role 4: Data engineer: Big Data, Hive, Spark, Python; 3-7 years of experience. Very good communication skills are mandatory. Must be ready to work from our office in Chennai. Timings: 9 hours, IST business hours, Monday to Friday.

    $1332 (Avg Bid)
    $1332 Avg
    2 bids

    ...ETL tasks about 50%, as well as 10% ML and 40% DS. Stack: SQL + PL/SQL (Greenplum, Teradata, MSSQL, MySQL, SQLite, …), DWH + ETL work with data warehouses, Hadoop (Hive, Impala, Spark, Oozie, …), Python (pandas, numpy, pyspark, …), Machine Learning. What you will do: refactoring machine-learning model prototypes from the Data Science team, adapting the code to the model-delivery pipeline for production use while storing results and model scores in the Greenplum warehouse (MLOps); design and development of a corporate analytics platform; development of batch and near-real-time analytics processes; development, support and optimisation of ETL on the Greenplum and Hadoop platforms; keeping technical documentation up to date.

    $2341 (Avg Bid)
    $2341 Avg
    6 bids

    WordPress site build and customization. So PHP, Node.js, Java, .NET, Hadoop?

    $1183 (Avg Bid)
    $1183 Avg
    118 bids

    The / filesystem needs reconfiguration for the HDFS layout.

    $25 / hr (Avg Bid)
    $25 / hr Avg
    8 bids

    Hi Maste...needed 4. Ability to bring a vision to life 5. Honesty and realism when it comes to agreed project deadlines 6. Reasonably accessible when needed 7. Available to provide continuous feedback as appropriate Plugins and Algorithms: • WP Web Scraper, Web Scraper Shortcode, Web Scraper, Web Scraper and SEO Tool for web scraping • Scrapy (Python), Beautiful Soup (Python) • Cheerio (JavaScript), Apache Nutch • Heritrix • Application Programming Interfaces (APIs) • Parsehub, Scrapinghub, Octoparse for data extraction • Tableau, Power BI, Looker • AI Chatbot for AI plugin enhancements. • Google Maps API, Google Search API for Application Programming Interfaces (APIs) Note: The above plugins and algorithms are not limited and may or ma...

    $50 (Avg Bid)
    $50 Avg
    1 bid

    The coding part includes Java; beyond that, we require experience with AWS, Hadoop, and Spark.

    $5 / hr (Avg Bid)
    $5 / hr Avg
    6 bids

    Hi, I am looking for support for a data analyst job (US healthcare claims, US timings). The required support is in Excel, SQL, Db2, Hadoop and Informatica (basics). One or two hours daily.

    $474 (Avg Bid)
    $474 Avg
    29 bids

    We are searching for an accountable, multitalented data engineer to facilitate the operations of our data scientists. The data engineer will be responsible for employing ...technological advancements that will improve the quality of your outputs. Data Engineer Requirements: Bachelor's degree in data engineering, big data analytics, computer engineering, or a related field. A Master's degree in a relevant field is advantageous. Proven experience as a data engineer, software developer, or similar. Expert proficiency in Python, C++, Java, R, and SQL. Familiarity with Hadoop or a suitable equivalent. Excellent analytical and problem-solving skills. A knack for independent and group work. Scrupulous approach to duties. Capacity to successfully manage a pipeline of duties with ...

    $43 / hr (Avg Bid)
    $43 / hr Avg
    18 bids

    Design and creation of an OpenStack infrastructure to implement a Big Data platform based on Hadoop/Spark, as well as its implementation. The project requires three profiles: OpenStack administrator, OpenStack engineer, and IT catalogue development. The work will be carried out mostly in Madrid; more details in the attached file.

    $33931 (Avg Bid)
    $33931 Avg
    4 bids

    ...has experience in writing on topics like AWS Azure GCP DigitalOcean Heroku Alibaba Linux Unix Windows Server (Active Directory) MySQL PostgreSQL SQL Server Oracle MongoDB Apache Cassandra Couchbase Neo4J DynamoDB Amazon Redshift Azure Synapse Google BigQuery Snowflake SQL Data Modelling ETL tools (Informatica, SSIS, Talend, Azure Data Factory, etc.) Data Pipelines Hadoop framework services (e.g. HDFS, Sqoop, Pig, Hive, Impala, Hbase, Flume, Zookeeper, etc.) Spark (EMR, Databricks etc.) Tableau PowerBI Artificial Intelligence Machine Learning Natural Language Processing Python C++ C# Java Ruby Golang Node.js JavaScript .NET Swift Android Shell scripting Powershell HTML5 AngularJS ReactJS VueJS Django Flask Git CI/CD (Jenkins, Bamboo, TeamCity, Octopus Deploy) Puppet/Ansible...

    $34 (Avg Bid)
    $34 Avg
    23 bids

    We are a leading training centre, Ni analytics india, looking for an experienced Data Engineer to train our students in online live classes on weekdays/weekends. The ideal candidate should have 4 to 8 years of data engineering work experience with Big Data: Hadoop, Spark, PySpark, Kafka, Azure, etc. We request interested candidates within our budget to respond, as we get regular enquiries from individuals and corporate firms. This is an urgent requirement; kindly respond quickly. Thank you.

    $367 (Avg Bid)
    $367 Avg
    4 bids

    ...disk volume of a powered down vm, causing vdfs missing file. Need to figure out how to recover the missing volume if at all possible. Also, there should be an old backup of the vm if we can't fix it but need to try the recovery first. Task: 1. Recover volume on vm. 2/3. Move VM backups/copies from 4 existing vm's to new 4 tb HDD drive (currently unmounted). These 4 vm's host 4 nodes of a hadoop CDH cluster environment so the VM's can have their disk partitions safely expanded. Currently they share hdd's so they are limited in size. 4. In those existing 4 vm's, maintain existing partitions, expand storage to utilize full capacity of 1x4tb drive per vm for 4x4tb HDD's, 1 mounted to each VM. There should currently be 4 partitions per ...

    $83 (Avg Bid)
    $83 Avg
    9 bids

    ...volume of a powered down vm, obviously that does not end well. Need to figure out how to recover the missing volume if at all possible. Also, there should be an old backup of the vm if we can't fix it but need to try the recovery first. Task: 1. Recover volume on vm. 2/3. Move VM backups/copies from 4 existing vm's to new 4 tb HDD drive (currently unmounted). These 4 vm's host 4 nodes of a hadoop CDH cluster environment so the VM's can have their disk partitions safely expanded. Currently they share hdd's so they are limited in size. 4. In those existing 4 vm's, maintain existing partitions, expand storage to utilize full capacity of 1x4tb drive per vm for 4x4tb HDD's, 1 mounted to each VM. There should currently be 4 partitions pe...

    $32 / hr (Avg Bid)
    $32 / hr Avg
    5 bids

    ...volume of a powered down vm, obviously that does not end well. Need to figure out how to recover the missing volume if at all possible. Also, there should be an old backup of the vm if we can't fix it but need to try the recovery first. Task: 1. Recover volume on vm. 2/3. Move VM backups/copies from 4 existing vm's to new 4 tb HDD drive (currently unmounted). These 4 vm's host 4 nodes of a hadoop CDH cluster environment so the VM's can have their disk partitions safely expanded. Currently they share hdd's so they are limited in size. 4. In those existing 4 vm's, maintain existing partitions, expand storage to utilize full capacity of 1x4tb drive per vm for 4x4tb HDD's, 1 mounted to each VM. There should currently be 4 partitions pe...

    $85 (Avg Bid)
    $85 Avg
    3 bids

    ...volume of a powered down vm, obviously that does not end well. Need to figure out how to recover the missing volume if at all possible. Also, there should be an old backup of the vm if we can't fix it but need to try the recovery first. Task: 1. Recover volume on vm. 2/3. Move VM backups/copies from 4 existing vm's to new 4 tb HDD drive (currently unmounted). These 4 vm's host 4 nodes of a hadoop CDH cluster environment so the VM's can have their disk partitions safely expanded. Currently they share hdd's so they are limited in size. 4. In those existing 4 vm's, maintain existing partitions, expand storage to utilize full capacity of 1x4tb drive per vm for 4x4tb HDD's, 1 mounted to each VM. There should currently be 4 partitions pe...

    $22 (Avg Bid)
    $22 Avg
    2 bids

    Need a Java expert with experience in Distributed Systems for Information Systems Management. It will involve the use of MapReduce and Spark, plus Linux and Unix commands. Part 1: Execute a MapReduce job on the cluster of machines; requires use of Hadoop classes. Part 2: Write a Java program that uses Spark to read The Tempest and perform various calculations. The name of the program is TempestAnalytics.java. I will share full details in chat; make your bids.
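
    A rough sketch of what TempestAnalytics.java could start from; the concrete calculations are only in the full spec, so this just reads the text with Spark and computes a few simple statistics, and the input path is a placeholder.

```java
// Sketch: read The Tempest with Spark (Java API) and compute basic statistics.
import java.util.Arrays;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class TempestAnalytics {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("TempestAnalytics");
        try (JavaSparkContext sc = new JavaSparkContext(conf)) {
            JavaRDD<String> lines = sc.textFile("hdfs:///data/tempest.txt"); // placeholder path

            long lineCount = lines.count();
            long wordCount = lines
                    .flatMap(line -> Arrays.asList(line.split("\\s+")).iterator())
                    .filter(word -> !word.isEmpty())
                    .count();
            long distinctWords = lines
                    .flatMap(line -> Arrays.asList(line.toLowerCase().split("\\W+")).iterator())
                    .filter(word -> !word.isEmpty())
                    .distinct()
                    .count();

            System.out.println("Lines: " + lineCount);
            System.out.println("Words: " + wordCount);
            System.out.println("Distinct words: " + distinctWords);
        }
    }
}
```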

    $665 (Avg Bid)
    $665 Avg
    7 bids

    Need a Java expert with experience in Distributed Systems for Information Systems Management. It will involve the use of MapReduce and Spark, plus Linux and Unix commands. Part 1: Execute a MapReduce job on the cluster of machines; requires use of Hadoop classes. Part 2: Write a Java program that uses Spark to read The Tempest and perform various calculations. The name of the program is TempestAnalytics.java. I will share full details in chat; make your bids.

    $884 (Avg Bid)
    $884 Avg
    6 bids
    Data Analyst (Closed)

    Digital Analyst: Job Responsibilities: The Analyst will work with lead analysts to deliver analytics by a. building analytics products to deliver automated, scaled insights in a self-serve manner (on the PBI/Tableau platform) b. assisting with complex data pulls and data manipulation to develop analytics dashboards or conduc...understanding of digital and data analytics • Excellent written, oral, and communication skills • Strong analytical skills with the ability to collect, organize, analyse, and disseminate significant amounts of information with attention to detail and accuracy • Keen eye for UI on PBI/Tableau – can recommend designs independently • Can handle complicated data transformations on DBs & Big Data (Hadoop) • Familiar...

    $12 (Avg Bid)
    $12 Avg
    2 bids
    Hive Projects (Closed)

    A mini project, with report and source code, on any topic in Hive and Hadoop programming.

    $65 (Avg Bid)
    $65 Avg
    3 bids
    Digital Analyst (Closed)

    Job Responsibilities: The Analyst will work with lead analysts to deliver analytics by a. building analytics products to deliver automated, scaled insights in a self-serve manner (on the PBI/Tableau platform) b. assisting with complex data pulls and data manipulation to develop analytics dashboards or conduct analytics deep di...understanding of digital and data analytics • Excellent written, oral, and communication skills • Strong analytical skills with the ability to collect, organize, analyse, and disseminate significant amounts of information with attention to detail and accuracy • Keen eye for UI on PBI/Tableau – can recommend designs independently • Can handle complicated data transformations on DBs & Big Data (Hadoop) • Familiarit...

    $28 (Avg Bid)
    $28 Avg
    6 bids
    Hadoop EMR setup (Closed)

    Hadoop EMR setup and data migration from Azure to AWS.

    $19 / hr (Avg Bid)
    $19 / hr Avg
    11 bids
    Hadoop Expert (Closed)

    Looking for a person who can help me install Hadoop.

    $5 / hr (Avg Bid)
    $5 / hr Avg
    2 bids

    .../ Define the problem. Create Tables with constraints Design a Schema based on tables and explain the schema. Create primary keys, foreign keys. Create Procedures. Create functions. Create Views Create Index Use of the following Clauses: Example : order by, between, group by, having, order by, AND, OR, with Use Aggregate Functions Use of nested queries, Scalar Subquery. Part 2 has to be done in HBASE Create Tables – 4 tables with Column family and columns Column family - 5 column families: Make sure have different parameter. Ex: versions Minimum 4 Columns in each Column family Insert records Delete records Perform basic queries like your assignment1 Try to extract data using timestamp Insert partial data in a row Describe table. Check table status – enabled or disable...
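
    Part 2 of this brief is standard HBase DDL/DML; a minimal, hedged sketch using the Java client API follows (the HBase shell would work equally well). The table name, column-family name and row data are placeholders, and only one of the requested column families and columns is shown.

```java
// Sketch of the HBase part: create a table with a versioned column family,
// insert a (partial) row, read it back, and delete it. Names are placeholders.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.util.Bytes;

public class HBaseAssignmentSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        try (Connection conn = ConnectionFactory.createConnection(conf);
             Admin admin = conn.getAdmin()) {

            TableName name = TableName.valueOf("students");
            // One column family that keeps multiple versions, as the brief asks.
            TableDescriptor desc = TableDescriptorBuilder.newBuilder(name)
                    .setColumnFamily(ColumnFamilyDescriptorBuilder.newBuilder(Bytes.toBytes("info"))
                            .setMaxVersions(3).build())
                    .build();
            if (!admin.tableExists(name)) {
                admin.createTable(desc);
            }

            try (Table table = conn.getTable(name)) {
                // Insert a record (a partial row: only one column written).
                Put put = new Put(Bytes.toBytes("row1"));
                put.addColumn(Bytes.toBytes("info"), Bytes.toBytes("name"), Bytes.toBytes("Alice"));
                table.put(put);

                // Read it back.
                Result result = table.get(new Get(Bytes.toBytes("row1")));
                System.out.println(Bytes.toString(
                        result.getValue(Bytes.toBytes("info"), Bytes.toBytes("name"))));

                // Delete the record.
                table.delete(new Delete(Bytes.toBytes("row1")));
            }
        }
    }
}
```

    Table status checks (enabled/disabled), timestamp-range reads and the remaining column families would be added on top of this skeleton.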

    $145 (Avg Bid)
    $145 Avg
    33 bids
    Database/ HBASE/ SQL (Closed)

    .../ Define the problem. Create Tables with constraints Design a Schema based on tables and explain the schema. Create primary keys, foreign keys. Create Procedures. Create functions. Create Views Create Index Use of the following Clauses: Example : order by, between, group by, having, order by, AND, OR, with Use Aggregate Functions Use of nested queries, Scalar Subquery. Part 2 has to be done in HBASE Create Tables – 4 tables with Column family and columns Column family - 5 column families: Make sure have different parameter. Ex: versions Minimum 4 Columns in each Column family Insert records Delete records Perform basic queries like your assignment1 Try to extract data using timestamp Insert partial data in a row Describe table. Check table status – enabled or disable...

    $45 (Avg Bid)
    $45 Avg
    10 bids

    Linux + Hadoop cloud migration: Azure data and on-prem data (Cloudera Hadoop) to AWS Cloudera. Azure, AWS, DevOps. Database migration from on-prem to AWS.

    $19 / hr (Avg Bid)
    $19 / hr Avg
    10 bids

    ※ Please see the attachment and offer your price quote with any questions [price and time are negotiable]. ※ We will need your help from the end of Dec through Jan 2023. 1) Manual: create a development and installation manual as an overall service implementation guideline using the HDFS and Impala APIs. > All details must be provided: commands/options/setting files/config, etc. > We will use your manual to create our own HDFS-based solution. > An additional two to four weeks of take-over time [we may ask questions when the process does not work under the manual process]. 2) Consulting: provide solutions for the heavy-load section (date inter delay) when data is inserted through HDFS. > Data should be processed in 3 minutes, but sometimes it takes more time. > Solutions for how we can remove or de...

    $999 (Avg Bid)
    $999 Avg
    7 bids

    Hadoop, Linux, Ansible, cloud, and good communication skills required.

    $7 / hr (Avg Bid)
    $7 / hr Avg
    1 bid

    Hello all, the objective of this subject is to learn how to design a distributed solution to a Big Data problem with the help of MapReduce and Hadoop. In fact, MapReduce is a software framework for spreading a single computing job across multiple computers. It is assumed that these jobs take too long to run on a single computer, so you run them on multiple computers to shorten the time. Auto bidders, please stay away. Thank you.
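
    For reference, the canonical "spread one job across multiple computers" example for this kind of assignment is word count; a minimal Hadoop MapReduce sketch follows. The input and output paths are placeholders, and the actual problem statement is only in the full brief.

```java
// Classic word count: mappers tokenize their input split, reducers sum per-word counts.
import java.io.IOException;
import java.util.StringTokenizer;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {
    public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();
        @Override
        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, ONE); // each mapper handles one input split
            }
        }
    }

    public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) sum += val.get();
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path("/input"));    // placeholder
        FileOutputFormat.setOutputPath(job, new Path("/output")); // placeholder
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```

    Packaged into a jar, this would be launched on the cluster with the standard hadoop jar command.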

    $100 (Avg Bid)
    $100 Avg
    4 bids

    Need Big Data and Hadoop tools, some of them being Spark SQL, Hadoop, Hive, Databricks, and data lakes.

    $30 (Avg Bid)
    $30 Avg
    6 bids

    Hello all, the objective of this subject is to learn how to design a distributed solution to a Big Data problem with the help of MapReduce and Hadoop. In fact, MapReduce is a software framework for spreading a single computing job across multiple computers. It is assumed that these jobs take too long to run on a single computer, so you run them on multiple computers to shorten the time. Auto bidders, please stay away. Thank you.

    $105 (Avg Bid)
    $105 Avg
    5 bids

    Require a developer who has 2 to 3 years of good experience in DevOps support, which includes Hadoop services, Windows, Linux, and Ansible, with a little cloud exposure.

    $8 / hr (Avg Bid)
    $8 / hr Avg
    7 bids

    Hello all, the objective of this subject is to learn how to design a distributed solution to a Big Data problem with the help of MapReduce and Hadoop. In fact, MapReduce is a software framework for spreading a single computing job across multiple computers. It is assumed that these jobs take too long to run on a single computer, so you run them on multiple computers to shorten the time. Auto bidders, please stay away. Thank you.

    $123 (Avg Bid)
    $123 Avg
    4 bids

    Hello all, the objective of this subject is to learn how to design a distributed solution to a Big Data problem with the help of MapReduce and Hadoop. In fact, MapReduce is a software framework for spreading a single computing job across multiple computers. It is assumed that these jobs take too long to run on a single computer, so you run them on multiple computers to shorten the time. Auto bidders, please stay away. Thank you.

    $140 (Avg Bid)
    $140 Avg
    6 bids

    Hello all, the objective of this subject is to learn how to design a distributed solution to a Big Data problem with the help of MapReduce and Hadoop. In fact, MapReduce is a software framework for spreading a single computing job across multiple computers. It is assumed that these jobs take too long to run on a single computer, so you run them on multiple computers to shorten the time. Auto bidders, please stay away. Thank you.

    $97 (Avg Bid)
    $97 Avg
    3 bids

    The objective of this assignment is to learn how to design a distributed solution to a Big Data problem with the help of MapReduce and Hadoop. In fact, MapReduce is a software framework for spreading a single computing job across multiple computers. It is assumed that these jobs take too long to run on a single computer, so you run them on multiple computers to shorten the time.

    $120 (Avg Bid)
    $120 Avg
    16 bids

    1. Implement the straggler solution using the approach below a) Develop a method to detect slow tasks (stragglers) in the Hadoop MapReduce framework using Progress Score (PS), Progress Rate (PR) and Remaining Time (RT) metrics b) Develop a method of selecting idle nodes to replicate detected slow tasks using the CPU time and Memory Status (MS) of the idle nodes. c) Develop a method for scheduling the slow tasks to appropriate idle nodes using CPU time and Memory Status of the idle nodes. 2. A good report on the implementation with graphics 3. A recorded execution process Use any certified data to test the efficiency of the methods
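
    The PS/PR/RT metrics in 1(a) are commonly defined as in the LATE-scheduler literature: progress rate PR = PS / elapsed time and estimated remaining time RT = (1 - PS) / PR. A hedged sketch of just that detection step follows; the class, the slowdown threshold and the sample numbers are illustrative rather than part of the brief, and the node-selection and scheduling steps (1b, 1c) are not shown.

```java
// Sketch: flag tasks whose estimated remaining time is well above the average.
import java.util.ArrayList;
import java.util.List;

public class StragglerDetector {

    static class TaskStats {
        final String taskId;
        final double progressScore;   // PS in [0, 1], as reported by Hadoop
        final double elapsedSeconds;  // time since the task attempt started

        TaskStats(String taskId, double progressScore, double elapsedSeconds) {
            this.taskId = taskId;
            this.progressScore = progressScore;
            this.elapsedSeconds = elapsedSeconds;
        }

        double progressRate() {                       // PR = PS / elapsed time
            return progressScore / Math.max(elapsedSeconds, 1e-9);
        }

        double remainingTime() {                      // RT = (1 - PS) / PR
            return (1.0 - progressScore) / Math.max(progressRate(), 1e-9);
        }
    }

    /** Flags tasks whose estimated remaining time exceeds slowdownFactor * mean RT. */
    static List<TaskStats> detectStragglers(List<TaskStats> tasks, double slowdownFactor) {
        double meanRemaining =
                tasks.stream().mapToDouble(TaskStats::remainingTime).average().orElse(0);
        List<TaskStats> stragglers = new ArrayList<>();
        for (TaskStats t : tasks) {
            if (t.remainingTime() > slowdownFactor * meanRemaining) {
                stragglers.add(t);
            }
        }
        return stragglers;
    }

    public static void main(String[] args) {
        List<TaskStats> tasks = List.of(
                new TaskStats("task_001", 0.90, 60),
                new TaskStats("task_002", 0.85, 65),
                new TaskStats("task_003", 0.20, 60)); // lagging task
        detectStragglers(tasks, 1.5)
                .forEach(t -> System.out.println("Straggler: " + t.taskId));
    }
}
```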

    $186 (Avg Bid)
    Urgent
    $186 Avg
    11 bids
    Stack : DATA ENG (Closed)

    Stack : DATA ENG 1. AWS 2. SPARK / HADOOP 3. PYTHON 4. Terraform

    $13 / hr (Avg Bid)
    $13 / hr Avg
    3 bids

    I have an input text file, plus mapper and reducer files that output the total count of each word in the text file. I would like the mapper and reducer to output only the top 20 words (and their counts) by count. The files use … and I want to be able to run them in Hadoop.
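
    One common way to get a global top 20 out of a word-count job, sketched here in Java MapReduce terms since the language of the existing mapper/reducer files is not stated above: keep a bounded heap in a single reducer and emit it in cleanup(). The Writable types are assumptions and would need to match the existing mapper's output.

```java
// Sketch: a top-N reducer that replaces the plain sum reducer.
import java.io.IOException;
import java.util.AbstractMap;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.PriorityQueue;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Reducer;

// Must run with a single reducer (job.setNumReduceTasks(1)) so the top 20 is global.
public class Top20Reducer extends Reducer<Text, LongWritable, Text, LongWritable> {

    private static final int N = 20;
    // Min-heap ordered by count, so the smallest of the current top-N is evicted first.
    private final PriorityQueue<Map.Entry<String, Long>> top =
            new PriorityQueue<>(Map.Entry.comparingByValue());

    @Override
    protected void reduce(Text word, Iterable<LongWritable> counts, Context context) {
        long total = 0;
        for (LongWritable c : counts) total += c.get();
        top.offer(new AbstractMap.SimpleEntry<>(word.toString(), total));
        if (top.size() > N) top.poll();
    }

    @Override
    protected void cleanup(Context context) throws IOException, InterruptedException {
        // Emit highest counts first.
        List<Map.Entry<String, Long>> result = new ArrayList<>(top);
        result.sort(Map.Entry.<String, Long>comparingByValue().reversed());
        for (Map.Entry<String, Long> e : result) {
            context.write(new Text(e.getKey()), new LongWritable(e.getValue()));
        }
    }
}
```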

    $138 (Avg Bid)
    $138 Avg
    12 bids

    I need help from a freelancer with strong knowledge of StreamSets Data Collector and/or Flink. Needed: a freelancer with experience in Flink, Hadoop and StreamSets Data Collector for about 10 hours of consultation. 1. I want to extract data from a DB and generate aggregation files every 15 minutes, ensuring there is no missing data between intervals while the query is running, using StreamSets. 2. Besides that, I am looking at Flink options to extract data from Kafka using tumbling aggregation intervals.
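
    For the Flink half (item 2), a hedged sketch of a Kafka source feeding a 15-minute tumbling window count; the topic name, broker address and the use of the older FlinkKafkaConsumer connector are assumptions, and the StreamSets part plus event-time/watermark handling are out of scope here.

```java
// Sketch: count Kafka records per 15-minute tumbling processing-time window.
import java.util.Properties;
import org.apache.flink.api.common.functions.AggregateFunction;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.windowing.assigners.TumblingProcessingTimeWindows;
import org.apache.flink.streaming.api.windowing.time.Time;
import org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer;

public class KafkaTumblingCount {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        Properties props = new Properties();
        props.setProperty("bootstrap.servers", "localhost:9092"); // placeholder broker
        props.setProperty("group.id", "tumbling-agg-demo");       // placeholder group

        DataStream<String> events = env.addSource(
                new FlinkKafkaConsumer<>("events", new SimpleStringSchema(), props));

        // Count all events per 15-minute tumbling window. A non-parallel windowAll
        // keeps the sketch short; keyBy + window would be used for per-key aggregates.
        events.windowAll(TumblingProcessingTimeWindows.of(Time.minutes(15)))
              .aggregate(new AggregateFunction<String, Long, Long>() {
                  @Override public Long createAccumulator() { return 0L; }
                  @Override public Long add(String value, Long acc) { return acc + 1; }
                  @Override public Long getResult(Long acc) { return acc; }
                  @Override public Long merge(Long a, Long b) { return a + b; }
              })
              .print();

        env.execute("15-minute tumbling aggregation");
    }
}
```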

    $11 / hr (Avg Bid)
    $11 / hr Avg
    2 bids

    I need help from a freelancer with strong knowledge of StreamSets Data Collector and/or Flink. Needed: a freelancer with experience in Flink, Hadoop and StreamSets Data Collector for about 10 hours of consultation. 1. I want to extract data from a DB and generate aggregation files every 15 minutes, ensuring there is no missing data between intervals while the query is running, using StreamSets. 2. Besides that, I am looking at Flink options to extract data from Kafka using tumbling aggregation intervals.

    $17 / hr (Avg Bid)
    $17 / hr Avg
    2 bids

    I need help from a freelancer with strong knowledge of StreamSets Data Collector and/or Flink. Needed: a freelancer with experience in Flink, Hadoop and StreamSets Data Collector for about 10 hours of consultation. 1. I want to extract data from a DB and generate aggregation files every 15 minutes, ensuring there is no missing data between intervals while the query is running, using StreamSets. 2. Besides that, I am looking at Flink options to extract data from Kafka using tumbling aggregation intervals. Please contact me ASAP. Thanks, David

    $18 / hr (Avg Bid)
    $18 / hr Avg
    21 bids

    I have some problems that need to be completed using Hadoop.

    $12 (Avg Bid)
    $12 Avg
    1 bid

    Hi, we are looking for an experienced person in "Hadoop" to give job support by connecting remotely and taking mouse control, for an Indian guy living in the US. USD $300/month, 2 hrs/day, 5 days/week. Timings: anytime after 7 PM IST will work, or any 2 hours before 10 AM IST.

    $250 (Avg Bid)
    $250 Avg
    1 bid