PySpark jobs

    873 PySpark jobs found, prices in USD

    Input: a tuple (id, termo) in which "id" is the document identifier and "termo" is a word from the text, already pre-processed. (Pseudocode/Python/PySpark/Spark)
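
    A minimal sketch of the MapReduce-style job described above, assuming the pre-processed (id, termo) tuples are already available as an RDD and that the goal is to build an inverted index (term -> list of document ids); the sample data and output format are hypothetical.

        from pyspark.sql import SparkSession

        spark = SparkSession.builder.appName("inverted-index").getOrCreate()
        sc = spark.sparkContext

        # Hypothetical pre-processed (id, termo) tuples.
        tuples = sc.parallelize([
            (1, "spark"), (1, "mapreduce"), (2, "spark"), (3, "python"),
        ])

        # Map: (id, termo) -> (termo, id); Reduce: group the document ids per term.
        inverted_index = (
            tuples.map(lambda pair: (pair[1], pair[0]))
                  .groupByKey()
                  .mapValues(lambda ids: sorted(set(ids)))
        )

        for termo, ids in inverted_index.collect():
            print(termo, ids)

        spark.stop()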

    $100 (Avg Bid)
    2 bids

    Development of an algorithm on MapReduce, using PySpark/Spark...

    $10 - $30
    0 bids

    I am seeking a skilled professional proficient in managing big data tasks with Hadoop, Hive, and PySpark. The primary aim of this project involves processing and analyzing structured data. Key Tasks: - Implementing Hadoop, Hive, and PySpark for my project to analyze large volumes of structured data. - Use Hive and PySpark for sophisticated data analysis and processing techniques. Ideal Skills: - Proficiency in Hadoop ecosystem - Experience with Hive and PySpark - Strong background in working with structured data - Expertise in big data processing and data analysis - Excellent problem-solving and communication skills Deliverables: - Converting raw data into useful information using Hive and Visualizing the results of queries into the graphical representation...
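
    A minimal sketch of the Hive-plus-PySpark analysis step described above, assuming a Hive table named sales is already registered in the metastore; the table and column names are hypothetical.

        from pyspark.sql import SparkSession
        from pyspark.sql import functions as F

        # enableHiveSupport() lets Spark read tables registered in the Hive metastore.
        spark = (SparkSession.builder
                 .appName("hive-analysis")
                 .enableHiveSupport()
                 .getOrCreate())

        # Hypothetical structured table: sales(region STRING, amount DOUBLE).
        sales = spark.table("sales")

        # Aggregate with the DataFrame API; the same query could be run via spark.sql().
        summary = (sales.groupBy("region")
                        .agg(F.sum("amount").alias("total_amount"),
                             F.count("*").alias("num_orders")))

        summary.show()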

    $17 / hr (Avg Bid)
    12 bids

    ...currently searching for an experienced AWS Glue expert, proficient in PySpark with data frames and Kafka development. The ideal candidate will have: • Expertise in data frame manipulation. • Experience with Kafka integration. • Strong PySpark development skills. The purpose of this project is data integration, and we will be primarily processing data from structured databases. The selected freelancer should be able to work with these databases seamlessly, ensuring efficient and effective data integration using AWS Glue. The required work would involve converting structured databases to fit into a data pipeline, setting up data processing, and integrating APIs using Kafka. This project requires a strong background in AWS Glue, PySpark, data frame ...
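
    A hedged sketch of the Kafka side of such a pipeline, shown here with plain PySpark Structured Streaming rather than Glue's DynamicFrame API; the broker address and topic name are hypothetical, and the Kafka connector package must be on the classpath.

        from pyspark.sql import SparkSession
        from pyspark.sql import functions as F

        # Requires the Kafka connector, e.g.
        # --packages org.apache.spark:spark-sql-kafka-0-10_2.12:<spark version>
        spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

        # Hypothetical broker and topic.
        raw = (spark.readStream
                    .format("kafka")
                    .option("kafka.bootstrap.servers", "broker:9092")
                    .option("subscribe", "orders")
                    .load())

        # Kafka delivers key/value as binary; cast the value to a string for parsing.
        events = raw.select(F.col("value").cast("string").alias("json_payload"))

        query = (events.writeStream
                       .format("console")
                       .outputMode("append")
                       .start())
        query.awaitTermination()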

    $239 (Avg Bid)
    23 bids

    I'm seeking assistance to develop a Python-based solution utilizing PySpark for efficient data processing using the Chord Protocol. This project demands an intermediate level of expertise in Apache Spark or PySpark, combining distributed computing knowledge with specific focus on Python programming. Key Requirements: - Proficiency in Python programming and PySpark framework. - Solid understanding of the Chord Protocol and its application in data processing. - Capable of implementing robust data processing solutions in a distributed environment. Ideal Skills and Experience: - Intermediate to advanced knowledge in Apache Spark or PySpark. - Experience in implementing distributed file sharing or data processing systems. - Familiarity with network communicati...

    $545 (Avg Bid)
    40 bids

    ...Professional with strong expertise in Pyspark for a multi-faceted project. Your responsibilities will include, but are not limited to: - Data analysis: You'll be working with diverse datasets including customer data, sales data and sensor data. Your role will involve deciphering this data, identifying key patterns and drawing out impactful insights. - Data processing: A major part of this role will be processing the mentioned datasets, and preparing them effectively for analysis. - Performance optimization: The ultimate aim is to enhance our customer targeting, boost sales revenue and identify patterns in sensor data. Utilizing your skills to optimize performance in these sectors will be highly appreciated. The ideal candidate will be skilled in Hadoop and Pyspark wi...

    $453 (Avg Bid)
    25 bids

    I require a Pyspark programmer to assist in reading and validating tables in a database. I need to ensure the columns in each table also exist in another source, effectively identifying any missing columns. I don't have a specific methodology for column validation in mind; I'm open to suggestions based on your expertise. You should ideally: - Be proficient in Python and pyspark - Have strong experience in data validation and database management - Be capable of completing the project promptly as I need the work done ASAP. Though time is of the essence, I still highly value quality work, and I'm looking for a knowledgeable and reliable freelancer. Your attention to detail will be key in this role.
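
    One possible approach to the column validation described above (a sketch, not a prescribed methodology): load both tables, compare their column sets, and report anything missing. The table names are hypothetical.

        from pyspark.sql import SparkSession

        spark = SparkSession.builder.appName("column-validation").getOrCreate()

        # Hypothetical source and target tables registered in the catalog.
        source_df = spark.table("staging.customers")
        target_df = spark.table("warehouse.customers")

        source_cols = set(c.lower() for c in source_df.columns)
        target_cols = set(c.lower() for c in target_df.columns)

        print("Columns missing in target:", sorted(source_cols - target_cols))
        print("Columns missing in source:", sorted(target_cols - source_cols))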

    $24 (Avg Bid)
    9 bids

    Build a Glue ETL job using PySpark to transfer data from MySQL to Postgres. I am facing challenges in the column mappings between the two sources; the target database has enum and text-array datatypes. You should resolve the errors in the column mappings and have prior experience ingesting data into the Postgres enum datatype.
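
    A hedged sketch of one way to move a table from MySQL to Postgres with plain PySpark JDBC (a Glue job could wrap the same logic); stringtype=unspecified is a PostgreSQL JDBC parameter that lets plain strings be cast into enum columns on insert. Hosts, credentials, and column names are hypothetical.

        from pyspark.sql import SparkSession
        from pyspark.sql import functions as F

        spark = SparkSession.builder.appName("mysql-to-postgres").getOrCreate()

        # Hypothetical MySQL source table.
        src = (spark.read.format("jdbc")
                    .option("url", "jdbc:mysql://mysql-host:3306/shop")
                    .option("dbtable", "orders")
                    .option("user", "reader").option("password", "secret")
                    .load())

        # Explicit column mapping / casting before the write.
        mapped = src.withColumn("status", F.col("status").cast("string"))

        # stringtype=unspecified lets Postgres cast plain strings into enum columns.
        (mapped.write.format("jdbc")
               .option("url", "jdbc:postgresql://pg-host:5432/shop?stringtype=unspecified")
               .option("dbtable", "public.orders")
               .option("user", "writer").option("password", "secret")
               .mode("append")
               .save())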

    $22 / hr (Avg Bid)
    55 bids

    I am in need of an experienced data engineer with specific expertise in PySpark. This project involves the integration and migration of data from structured databases currently housed in AWS. Here's a rundown of your key responsibilities: - Data integration from various existing structured databases - Migration of the combined data to a single, more efficacious database Ideal Candidate: - Proven experience in data migration and integration projects - Expertise in PySpark is indispensable - Proficiency in manipulating AWS databases - A solid understanding of structured databases and various data formats is mandatory This project is more than just technical skills- I'm looking for someone who can understand the bigger picture and contribute to the overarching str...

    $659 (Avg Bid)
    13 bids

    I'm looking for a professional with a strong understanding of PySpark to help transform a dataframe into JSON following a specific schema. This project's main task is data transformation to aid in data interchange. The project requires: - Expertise in PySpark - Proficiency in data transformation techniques - Specific experience in data aggregation For the transformation, I require the application of an aggregation method. In this case, we will be sorting the data. It's crucial that you are skilled in various aggregation methods, especially sorting. Your knowledge in handling critical PySpark operations is crucial for this job's success. Experience in similar projects will be highly regarded.
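
    A minimal sketch of this kind of transformation, assuming a flat input DataFrame that must be grouped, sorted within each group, and emitted as JSON documents; all column names are hypothetical.

        from pyspark.sql import SparkSession
        from pyspark.sql import functions as F

        spark = SparkSession.builder.appName("df-to-json").getOrCreate()

        df = spark.createDataFrame(
            [("c1", "2024-01-02", 10.0), ("c1", "2024-01-01", 5.0), ("c2", "2024-01-03", 7.5)],
            ["customer_id", "order_date", "amount"],
        )

        # Aggregate the orders per customer and sort them inside each document.
        docs = (df.groupBy("customer_id")
                  .agg(F.sort_array(
                           F.collect_list(F.struct("order_date", "amount"))
                       ).alias("orders")))

        # Serialize each row to a JSON string that follows the target schema.
        json_out = docs.select(F.to_json(F.struct("customer_id", "orders")).alias("json"))
        json_out.show(truncate=False)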

    $24 (Avg Bid)
    19 bids

    Looking for an expert Azure Data Engineer to assist with multiple tasks. Your responsibilities will include: - Implementing and managing Azure Data Lake and Data Ingestion. - Developing visual reports...platforms to achieve three main objectives: - Perform sophisticated data analysis and visualization. - Enable advanced data integration and transformation. - Build custom applications to meet specific needs. Candidates should have an advanced understanding of Azure Data Lake, Power BI, and Powerapps, bringing a minimum of 6 years of experience with Databricks. Proficiency in Python, SQL, PostgreSQL, and Pyspark is also required. Knowledge of GitHub and the CI/CD Process will be beneficial for this role. If you have the skills and expertise needed for this project, I'd love to...

    $34 / hr (Avg Bid)
    29 bids

    ...need to be pushed swiftly to Elasticsearch using Pyspark. Your expertise will help push all data columns from this file into Elasticsearch, establishing a more actionable access to a significant amount of data. Given the project's urgency, I'm expecting a rapid, reliable transition. While the structure for the documents remains undecided due to the project's intricacies, I'm open to suggestions that will make this process more efficient and effective. Anyone with experience in Pyspark, Elasticsearch, and vast data manipulation will have a substantial edge on this project, as these skills are highly necessary for success. A strong understanding of different data structures is also a plus. • Leading Skills Required: Proficiency in Pyspark ...
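
    A hedged sketch of pushing a CSV into Elasticsearch with the elasticsearch-hadoop Spark connector; the file path, index name, and node address are hypothetical, and the connector JAR must be on the classpath.

        from pyspark.sql import SparkSession

        # Requires the elasticsearch-hadoop (elasticsearch-spark) connector JAR.
        spark = SparkSession.builder.appName("csv-to-es").getOrCreate()

        # Hypothetical CSV file; the header row provides the column names.
        df = (spark.read
                   .option("header", "true")
                   .option("inferSchema", "true")
                   .csv("/data/events.csv"))

        # Push every column of every row into the hypothetical "events" index.
        (df.write
           .format("org.elasticsearch.spark.sql")
           .option("es.nodes", "es-host")
           .option("es.port", "9200")
           .mode("append")
           .save("events"))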

    $10 / hr (Avg Bid)
    3 bids

    ...Title: Pyspark Data Engineering Training Overview: I am a beginner/intermediate in Pyspark and I am looking for a training program that focuses on data processing. I prefer one on one and written guides as the format for the training. Skills and Experience Required: - Strong expertise in Pyspark and data engineering - Excellent knowledge of data processing techniques - Experience in creating and optimizing data pipelines - Familiarity with data manipulation and transformation using Pyspark - Ability to explain complex concepts in a clear and concise manner through written guides - Understanding of best practices for data processing in Pyspark Training Topics: The training should primarily focus on data processing. The following topics should be cov...

    $23 / hr (Avg Bid)
    78 bids

    ...training is expected to be spread across multiple days. The trainer must have the capability to provide an understanding of the major concepts and components of Apache Spark, with a focus on how to use Databricks and the Pyspark API to manipulate and visualize data. As the training progresses, the instructor should be able to explain how to develop applications using Pyspark and articulate different approaches that a data scientist would use to evaluate and test their models. The instructor should also be able to educate the users on how to deploy and maintain Pyspark applications and how to provide feedback and questions in order to improve their performance. We expect the trainer to be readily available to answer any questions and guide the users along the w...

    $103 (Avg Bid)
    84 bids

    I am seeking an expert in the field to provide remote training in the use of Databricks and Python with PySpark. This is important for developing data processing applications with a high degree of efficiency. The training should cover areas such as data wrangling, machine learning, and Spark streaming. In order to be successful, attendees must be well-versed in Databricks, Python and PySpark, as these skills will be essential for completing the course. The course should provide a good understanding of the concepts and practical application of these tools. This training will give attendees the skills they need to analyse and manipulate large datasets, develop effective data processing pipelines, design powerful machine learning models and build reliable applications that use...

    $109 (Avg Bid)
    83 bids

    ...S3, and RDS; Azure services; and Pyspark data processing and transformations. Essential Skills: - Proficient in AWS, specifically on EC2, S3, RDS with strong understanding of data storage and retrieval. - Expert in Azure services such as Azure SQL Database and Blob Storage. - Highly experienced in writing efficient data transformations using Pyspark. Ideal Experience: - Minimum 7 years in the field with solid experience in technical interviews and coaching. Your task will be to provide actionable insights, best practices, and expert advice to nail my upcoming technical interview. Having been on the other side of the interview table would be an added advantage. - Proven track record of performing successful data processing and transformations using Pyspark. - Prev...

    $15 / hr (Avg Bid)
    8 bids

    Experienced Python + SQL + AWS + Azure data engineer (7+ years) for evening IST timings, to guide interview preparation, especially for data engineering. Tasks: Should have good knowledge of PySpark, SQL, and pandas. Should have written multiple ETL pipelines in AWS and Azure. Note: The freelancer must be available during evening IST timings.

    $10 / hr (Avg Bid)
    12 bids

    ...structured data such as SQL databases. Skills and experience required: - Expertise in AWS migration, specifically from another cloud provider - Strong knowledge and experience with structured data, particularly SQL databases - Familiarity with AWS Glue and Athena for data processing and analysis - Ability to work with a combination of different AWS services for optimal performance and efficiency Tech stack: PySpark, SQL, Python; CDK (TypeScript); AWS Glue, EMR, and Andes. Currently migrating from Teradata to AWS. Responsibilities: - Migrate data from another cloud provider to AWS, ensuring a smooth transition and minimal downtime - Design and develop applications that utilize AWS Glue and Athena for data processing and analysis - Optimize data storage and retrieval using AWS S3 and R...

    $9 / hr (Avg Bid)
    14 bids

    ...am looking for a skilled and experienced developer to work on a personal project involving the use of CNN by pyspark for analyzing brain and lung cancer. Skills and Experience: - Proficient in using pyspark and CNN - Intermediate understanding of convolutional neural networks - Familiarity with analyzing medical data - Experience in working with cancer-related datasets - Strong problem-solving skills and attention to detail The project requires the use of specific datasets, which I already have. However, any additional assistance in acquiring relevant datasets would be appreciated. The ideal candidate should have a good understanding of CNN and be able to apply it using pyspark. Experience in analyzing medical data and working with cancer-related datasets would ...

    $36 (Avg Bid)
    10 bids

    I am looking for a skilled professional who can help me with a project titled "synapse pyspark delta lake merge scd type2 without primary key". The ideal candidate should have experience and expertise in the following areas: Desired Outcome: - The desired outcome of the merge process is to update existing records and insert new records. Data Quality: - The level of data quality required for the outcome is high integrity, with no duplicates and full accuracy. Handling Historical Data: - There is a specific requirement to keep track of historical changes to the data. Skills and Experience: - Proficiency in Synapse, Pyspark, Delta Lake - Experience with SCD Type 2 implementation - Strong understanding of data integrity and accuracy - Ability to handle historical da...
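
    A hedged sketch of the Delta Lake MERGE at the core of such an SCD Type 2 flow, assuming that, with no primary key available, a composite business key is derived from several columns; table and column names are hypothetical. A complete SCD2 pipeline would also need a second pass to insert the new version of rows whose attributes changed (this sketch only closes the old version and inserts brand-new keys).

        from delta.tables import DeltaTable
        from pyspark.sql import SparkSession

        spark = SparkSession.builder.appName("scd2-merge").getOrCreate()

        target = DeltaTable.forName(spark, "dim_customer")   # existing SCD2 table
        updates = spark.table("staging_customer")            # incoming snapshot

        # No primary key: use a composite business key for the match condition.
        join_cond = "t.name = s.name AND t.birth_date = s.birth_date AND t.is_current = true"

        (target.alias("t")
           .merge(updates.alias("s"), join_cond)
           .whenMatchedUpdate(
               condition="t.address <> s.address",           # tracked attribute changed
               set={"is_current": "false", "end_date": "current_date()"})
           .whenNotMatchedInsert(values={
               "name": "s.name", "birth_date": "s.birth_date", "address": "s.address",
               "is_current": "true", "start_date": "current_date()", "end_date": "null"})
           .execute())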

    $331 (Avg Bid)
    2 bids

    ...Senior Data Engineer who possesses extensive experience and proficiency in a range of key technologies and tools. The ideal candidate should have a strong background in Python, demonstrating skillful use of this programming language in data engineering contexts. Proficiency in Apache Spark is essential, as we rely heavily on this powerful analytics engine for big data processing. Experience with PySpark, the Python API for Spark, is also crucial. In addition to these core skills, we require expertise in AWS cloud services, particularly AWS Glue and Amazon Kinesis. Experience with AWS Glue will be vital for ETL operations and data integration tasks, while familiarity with Amazon Kinesis is important for real-time data processing applications. Furthermore, the candidate should hav...

    $11 / hr (Avg Bid)
    11 bids

    I am looking for an Airflow, GCP, and Python expert to assist me with my project. Candidate should have a good knowledge of DAG, GIT, pandas, agile, pyspark and Airflow.
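
    A minimal sketch of the kind of Airflow orchestration mentioned above, assuming Airflow 2.x and a hypothetical PySpark step wrapped in a Python callable; the DAG id and schedule are placeholders.

        from datetime import datetime

        from airflow import DAG
        from airflow.operators.python import PythonOperator


        def run_pyspark_job():
            # Placeholder for a spark-submit call or a SparkSession-based job.
            print("running the PySpark step")


        with DAG(
            dag_id="daily_pyspark_pipeline",   # hypothetical name
            start_date=datetime(2024, 1, 1),
            schedule_interval="@daily",
            catchup=False,
        ) as dag:
            process = PythonOperator(
                task_id="process_data",
                python_callable=run_pyspark_job,
            )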

    $291 (Avg Bid)
    20 bids

    I am looking for a freelancer who can assist me with a Pyspark AWS ML project. The main goal of the project is data processing and transformation. I already have all the data needed for the project. The preferred timeline for this project is flexible. Skills and Experience: - Strong experience with Pyspark and AWS ML - Proficient in data processing and transformation techniques - Familiarity with machine learning model development - Ability to work within a flexible timeline

    $15 / hr (Avg Bid)
    29 bids

    Years of experience: 7+
    Location: Remote - India
    Contract Tenure: 03-06 Months
    Notice Period: Immediate - 15/20 Days
    Timings: 12pm - 9pm IST, M - F
    AWS Data Engineer Requirements
    • Collaborate with business an...functions to handle data quality and validation.
    • Should have a good understanding of S3, CloudFormation, CloudWatch, Service Catalog and IAM Roles.
    • Perform data validation and ensure data accuracy and completeness by creating automated tests and implementing data validation processes.
    • Should have good knowledge of Tableau, including creating Tableau Published Datasets and managing access.
    • Write PySpark scripts to process data and perform transformations. (Good to have)
    • Run Spark jobs on AWS EMR cluster using Airflow DAGs. (Good to have) &...

    $3294 (Avg Bid)
    16 bids

    Looking for someone who has a good knowledge of Pyspark, Airflow DAGs, GitHub, Pandas and Agile Framework. Overall candidate should be well aware of the data ingestion approach. Knowledge of Google cloud platform is a Bonus

    $300 (Avg Bid)
    26 bids

    I am looking for a skilled AWS Cloud + PySpark developer to create a Glue Streaming WordCount program. The program should be able to perform word count analysis on streaming data. I have the PySpark streaming code ready, which works in my Jupyter notebook, so I need help integrating Kinesis/MSK --> Glue --> RDS/S3.
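
    A minimal Structured Streaming word count sketch, shown with a local socket source for testing (nc -lk 9999); a Glue streaming job would swap in a Kinesis or MSK source and an S3/RDS sink, so the source here is purely illustrative.

        from pyspark.sql import SparkSession
        from pyspark.sql import functions as F

        spark = SparkSession.builder.appName("streaming-wordcount").getOrCreate()

        # Hypothetical local socket source for testing.
        lines = (spark.readStream
                      .format("socket")
                      .option("host", "localhost")
                      .option("port", 9999)
                      .load())

        words = lines.select(F.explode(F.split(F.col("value"), r"\s+")).alias("word"))
        counts = words.groupBy("word").count()

        # Complete mode re-emits the full count table on every trigger.
        query = (counts.writeStream
                       .outputMode("complete")
                       .format("console")
                       .start())
        query.awaitTermination()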

    $9 (Avg Bid)
    2 bids
    Big Data Project (Ended)

    ...Specific Letters (Using Spark) 5. Top Selling Countries (Using Spark) 6. Item Costs (Using Spark) 7. Sales Yearwise (Using PySpark) 8. Orders per Item (Using PySpark) 9. Country with Highest Sales (Using PySpark) 10. Customer Segmentation: Use clustering algorithms to identify different customer segments. 11. Time Series Forecasting: Predict future sales using ARIMA or LSTM. 12. Anomaly Detection: Identify any anomalies or outliers that could indicate fraudulent activity. 13. Association Rule Mining: Find associations between different products in the data (Using Spark). 14. Price Elasticity: Understand how the demand for a product changes with a change in its price (Using PySpark). 15. Correlation Between Priority and Profit: Analyze if 'Order Priority&...
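
    As a small illustration of item 9 above (country with the highest sales), a PySpark sketch with hypothetical Country and Sales columns:

        from pyspark.sql import SparkSession
        from pyspark.sql import functions as F

        spark = SparkSession.builder.appName("top-country").getOrCreate()

        orders = spark.createDataFrame(
            [("Brazil", 120.0), ("India", 300.0), ("Brazil", 80.0), ("India", 50.0)],
            ["Country", "Sales"],
        )

        # Sum sales per country and keep the single highest total.
        top_country = (orders.groupBy("Country")
                             .agg(F.sum("Sales").alias("total_sales"))
                             .orderBy(F.desc("total_sales"))
                             .limit(1))
        top_country.show()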

    $81 (Avg Bid)
    10 bids
    Big Data Project (Ended)

    ...Specific Letters (Using Spark) 5. Top Selling Countries (Using Spark) 6. Item Costs (Using Spark) 7. Sales Yearwise (Using PySpark) 8. Orders per Item (Using PySpark) 9. Country with Highest Sales (Using PySpark) 10. Customer Segmentation: Use clustering algorithms to identify different customer segments. 11. Time Series Forecasting: Predict future sales using ARIMA or LSTM. 12. Anomaly Detection: Identify any anomalies or outliers that could indicate fraudulent activity. 13. Association Rule Mining: Find associations between different products in the data (Using Spark). 14. Price Elasticity: Understand how the demand for a product changes with a change in its price (Using PySpark). 15. Correlation Between Priority and Profit: Analyze if 'Order Priority&...

    $71 (Avg Bid)
    4 bids

    It's a simple dataset and I have already analysed it using pandas. I want to analyse it using Pyspark and the Koalas API.
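
    A small sketch of the pandas-to-Koalas switch described above; databricks.koalas was the pre-Spark-3.2 package name (the same API now ships as pyspark.pandas), and the data and column names are hypothetical.

        import pandas as pd
        import databricks.koalas as ks   # on Spark >= 3.2: import pyspark.pandas as ps

        pdf = pd.DataFrame({"city": ["SP", "RJ", "SP"], "sales": [10, 20, 30]})

        # Convert the existing pandas frame into a Koalas frame backed by Spark.
        kdf = ks.from_pandas(pdf)

        # The familiar pandas-style API now runs distributed on Spark.
        print(kdf.groupby("city")["sales"].sum().sort_index())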

    $176 (Avg Bid)
    6 bids
    PySpark training (Ended)

    Project Description: I am looking for a PySpark trainer who has advanced experience and expertise in data processing. The ideal candidate should be able to provide a scheduled training course. Skills and Experience: - Advanced level of experience with PySpark - Strong knowledge and expertise in tools such as Databricks and PyCharm, and in transformations & actions - Ability to provide a scheduled training course
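
    A tiny illustration of the "transformations & actions" topic named above: transformations such as filter() are lazy and only build the execution plan, and nothing runs until an action such as count() is called.

        from pyspark.sql import SparkSession

        spark = SparkSession.builder.appName("lazy-demo").getOrCreate()

        df = spark.range(1_000_000)          # numbers 0..999999

        # Transformations: only build the plan, nothing executes yet.
        evens = df.filter(df["id"] % 2 == 0)
        doubled = evens.selectExpr("id * 2 AS doubled")

        # Action: triggers the actual distributed computation.
        print(doubled.count())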

    $307 (Avg Bid)
    3 bids

    I am seeking assistance with Pyspark and small file remediation. Specifically, I am facing file format compatibility issues. Skills and experience required: - Intermediate level of experience with Pyspark - Strong understanding of file format compatibility - Proficiency in data processing and performance optimization Project requirements: - The small files I am working with add up to about 10 GB in total - The goal is to resolve file format compatibility issues and ensure smooth data processing - Attention to detail is crucial to avoid any data processing errors If you have expertise in Pyspark, file format compatibility, and can efficiently handle large files, I would love to discuss this project further. Please provide any relevant experience or work samples in your prop...
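
    One common remediation pattern for the small-file problem described above (offered as a sketch, not the required fix): read the fragmented data, convert it to a columnar format, and rewrite it with a controlled number of output files. The paths and file count are hypothetical.

        from pyspark.sql import SparkSession

        spark = SparkSession.builder.appName("small-file-compaction").getOrCreate()

        # Hypothetical directory full of small CSV files.
        df = spark.read.option("header", "true").csv("/data/raw/events/")

        # Repartition to an explicit number of files sized for a ~10 GB dataset,
        # and rewrite in Parquet to avoid format-compatibility issues downstream.
        (df.repartition(80)
           .write.mode("overwrite")
           .parquet("/data/curated/events/"))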

    $30 (Avg Bid)
    1 bid
    Software developers (Ended)

    I am looking for software developers who are proficient in Python, Pyspark, and AWS and have good experience. The project timeline is estimated to be 1-2 weeks. Skills and experience required: - Proficiency in the Python programming language - Experience working with various frameworks or platforms - Hands-on experience with AWS and Pyspark - Strong problem-solving skills - Good communication and collaboration skills.

    $4 / hr (Avg Bid)
    15 bids

    I am looking for an experienced HDFS and PySpark expert to assist me with various tasks related to data ingestion, storage, processing, and analysis. The ideal freelancer should have a strong background in these technologies and be able to provide past work examples that showcase their expertise. Key requirements: - Expertise in HDFS and PySpark Timeline: - The project is expected to be completed within 1-2 weeks. If you meet these requirements and have the necessary experience, please include details of your past work and relevant experience in your application.

    $50 / hr (Avg Bid)
    7 bids
    Query in PySpark (Ended)

    I am looking for a freelancer who can help me with a data analysis project using PySpark. I have a specific dataset that I would like to query, which is of medium size (1-10 GB). Skills and Experience: - Strong knowledge and experience in PySpark - Expertise in data analysis and data manipulation - Familiarity with working with medium-sized datasets - Ability to write efficient and optimized queries in PySpark The ideal freelancer for this project should have a strong background in data analysis and be proficient in PySpark. They should also have experience working with medium-sized datasets and be able to write efficient queries to extract meaningful insights from the data.

    $16 (Avg Bid)
    4 bids

    ...looking for a Pyspark AWS data engineer who can help me with building and deploying ETL for machine learning models. Must initially pass a python online coding exam. Tasks: - Building ETL models using Pyspark and AWS - Deploying the models on AWS infrastructure - use terraform, spin up etl clusters, understand basic data related aws cloud tools, infrastructure and security. This is NOT a devops position but you should be able to get around and use data engineering related aws tools. Infrastructure: - The project requires migrating within aws to a new infrastructure Involvement: - partially involved in the project at half time 3-5 hours a day on a consistent reliable time of your choosing. Ideal skills and experience: - Strong experience in data engineering with P...

    $39 / hr (Avg Bid)
    14 bids
    Databricks PySpark (Ended)

    Need help on a Databricks task: parse a fixed-width file and load it into Unity Catalog tables.
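
    A hedged sketch of the fixed-width parsing step: read each line as a single text column, slice it with substring(), and save the result as a table. The field positions, input path, and catalog/table names are hypothetical.

        from pyspark.sql import SparkSession
        from pyspark.sql import functions as F

        spark = SparkSession.builder.appName("fixed-width-load").getOrCreate()

        # Each input line arrives as one string column named "value".
        raw = spark.read.text("/mnt/raw/accounts.txt")

        # Hypothetical layout: cols 1-10 account id, 11-40 name, 41-50 balance.
        parsed = raw.select(
            F.trim(F.substring("value", 1, 10)).alias("account_id"),
            F.trim(F.substring("value", 11, 30)).alias("account_name"),
            F.substring("value", 41, 10).cast("decimal(10,2)").alias("balance"),
        )

        # On Databricks this would typically target a Unity Catalog table,
        # e.g. a hypothetical main.finance.accounts.
        parsed.write.mode("overwrite").saveAsTable("main.finance.accounts")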

    $20 / hr (Avg Bid)
    27 bids
    PySpark Developer (Ended)

    Have a project with SQL and Python code that needs to be converted to Spark SQL and the DataFrame API.
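
    A small before/after sketch of this kind of conversion: the same query expressed once with spark.sql and once with the DataFrame API. The table and column names are hypothetical.

        from pyspark.sql import SparkSession
        from pyspark.sql import functions as F

        spark = SparkSession.builder.appName("sql-to-dataframe").getOrCreate()

        orders = spark.createDataFrame(
            [("open", 10.0), ("closed", 25.0), ("open", 5.0)], ["status", "amount"]
        )
        orders.createOrReplaceTempView("orders")

        # Original-style Spark SQL query.
        sql_result = spark.sql(
            "SELECT status, SUM(amount) AS total FROM orders GROUP BY status"
        )

        # Equivalent DataFrame version.
        df_result = orders.groupBy("status").agg(F.sum("amount").alias("total"))

        sql_result.show()
        df_result.show()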

    $511 (Avg Bid)
    61 bids

    I am looking for a skilled PySpark developer to help me fix bugs in my visualization project. The specific bugs I am experiencing are related to data not displaying correctly. Skills and experience required: - Strong knowledge of PySpark and data visualization - Experience with troubleshooting and debugging PySpark projects - Familiarity with visualization tools such as Matplotlib and Seaborn The ideal candidate should be able to work efficiently and effectively to fix the bugs within a two-week timeframe. Attention to detail and the ability to analyze and interpret data accurately are essential for this project.

    $56 (Avg Bid)
    9 bids

    Project Title: Bug Identification in pyspark project I am looking for a skilled developer who can help me identify and fix functional issues in my pyspark project. The bug is specifically affecting the data analysis section of the code. Skills and Experience: - Strong proficiency in pyspark and data analysis - Experience in identifying and fixing functional issues in pyspark projects - Familiarity with data processing and data visualization - Ability to work within a deadline, as the bug needs to be fixed within two weeks If you have the necessary skills and experience, please submit your proposal. Thank you.

    $51 (Avg Bid)
    7 bids

    I am looking for an experienced Azure Data Engineer to work on my project specifically only from Hyderabad , India Specific Data Engineering Tasks: - Yes, I have some specific data engineering tasks in mind Preferred Tool for Data Processing and Analysis: Pyspark - Azure Databricks Skills and Experience Required: - Strong experience with Azure Data Factory, Azure Databricks, and Azure Synapse Analytics - Proficiency in data processing and analysis using Azure Databricks - Ability to handle large data sets efficiently - Knowledge of data engineering best practices and optimization techniques - Familiarity with Azure cloud services and infrastructure - Excellent problem-solving and troubleshooting skills - Strong communication and collaboration skills If you have the required sk...

    $84 (Avg Bid)
    2 bids

    As a beginner in the world of pyspark, I am looking for an experienced developer to provide guidance as I work on my project. I have a specific project in my work that I am tackling and need assistance understanding the syntax and functions of pyspark to make sure I'm on the right track. I am looking for someone who can provide me with clear and concise instruction to help me with optimizing performance and scalability of my pyspark project.

    $14 (Avg Bid)
    9 bids

    I am looking for a Python programmer who can work on a project involving real-time data processing. The streaming data would be consumed using PySpark Kafka Structured Streaming and is expected to be processed in real time. The tasks to be completed in the project include setting up and constructing an efficient data pipeline that is capable of obtaining the data, processing the data, and then running the data through analytics and further data visualization. The programmer should have experience with machine learning implementation and a willingness to work as part of a wider team.

    $54 (Avg Bid)
    15 bids

    Hello, We are currently seeking an experienced Python Developer to collaborate with both our Java and PySpark teams to address pending tasks. We require a Python Developer with substantial experience in handling enterprise-level data via APIs, including integration with third-party APIs. The selected developer will work closely with our development team for a duration of 1-2 weeks to finalize these tasks. To express your interest and share your relevant experience, please apply. More detailed information will be provided to candidates after the initial profile screening. Thank you.

    $8 / hr (Avg Bid)
    31 bids
    Bigdata PySpark (Ended)

    I am looking for a freelancer who can help me with my Bigdata Pyspark project. The main goal of this project is data analysis. I have a specific dataset that I can provide for this project. I would like the project to be completed in more than two weeks. Ideal Skills and Experience: - Strong knowledge and experience in Bigdata and Pyspark - Proficiency in data analysis techniques and tools - Experience with handling large datasets - Familiarity with data visualization techniques - Good understanding of machine learning algorithms and techniques

    $24 (Avg Bid)
    16 bids

    Need to solve this error while processing the payload in PySpark, invoked by Java on AWS. Below is the error for reference:
    { "status": 500, "response": "There is some error occur while Rule processing through API call. : I/O error on POST request for "http://3.219.239.160:9000/process_data": Unexpected end of file from server; nested exception is : Unexpected end of file from server", "message": "There is some error occur while Rule processing through API call. : I/O error on POST request for "http://3.219.239.160:9000/process_data": Unexpected end of file from server; nested exception is : Unexpected end of file from

    $14 / hr (Avg Bid)
    11 bids

    I am looking for help with existing pyspark code that needs to be modified. The task itself is to modify existing pyspark logic. I need someone who is knowledgeable and experienced working with pyspark. The timeline for this task is as soon as possible. I understand important details may need to be discussed, tweaked or clarified, so some flexibility is appreciated. If you are an experienced pyspark developer, I welcome your proposals to my project. Together, let’s see if we can find a solution that works for all of us!

    $22 (Avg Bid)
    12 bids

    I need a Java and PySpark expert now. Start your bid with "pyspark".

    $12 (Avg Bid)
    11 bids

    Programming: PySpark & JavaScript. The user should be able to input Python source code first; the app will then generate documentation for the code and let the user save it (documentation of the functions and classes), and will also show dependencies between the classes and source code metrics. The project is to create an app that lets the user (client) upload Python source code and generates documentation of the uploaded code (such as a list of functions and a class diagram). The output must include: all the class names and what is inside each class; a class diagram showing the relationships and dependencies between the classes; and all the functions in the code (with an explanation of each function).
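
    A minimal Python sketch of the core documentation step described above, using the standard-library ast module to list the classes and functions (with docstrings and base classes) found in an uploaded source file; the file name is hypothetical and the JavaScript front end is out of scope here.

        import ast

        # Hypothetical uploaded file.
        with open("uploaded_module.py", "r", encoding="utf-8") as fh:
            tree = ast.parse(fh.read())

        for node in ast.walk(tree):
            if isinstance(node, ast.ClassDef):
                print(f"class {node.name}: {ast.get_docstring(node) or 'no docstring'}")
                bases = [ast.unparse(b) for b in node.bases]   # ast.unparse needs Python 3.9+
                if bases:
                    print(f"  inherits from: {', '.join(bases)}")
            elif isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
                print(f"def {node.name}(): {ast.get_docstring(node) or 'no docstring'}")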

    $106 (Avg Bid)
    24 bids

    Ontology Based Program for Python Programming Environment

    $5 / hr (Avg Bid)
    18 bids

    I am looking for a freelancer who can convert my pandas code to pyspark. The dataset is small, less than 1 GB in size. I don't have specific transformations or operations in mind, but I am open to suggestions. It is important that the pyspark code is optimized for performance. Ideal skills and experience: - Strong knowledge and experience in both pandas and pyspark - Ability to understand and convert pandas code to pyspark - Familiarity with optimizing pyspark code for performance The output of the pyspark code should be the same as the existing Python/pandas output; please add print statements to verify. Versions: spark - 2.4.7.7, Anaconda3-2018
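
    A hedged example of the requested conversion pattern: a tiny pandas aggregation and its PySpark equivalent, with print statements to confirm the outputs match. The data and column names are hypothetical, and the PySpark syntax shown is compatible with Spark 2.4.

        import pandas as pd
        from pyspark.sql import SparkSession
        from pyspark.sql import functions as F

        # --- existing pandas version ---
        pdf = pd.DataFrame({"city": ["SP", "RJ", "SP"], "sales": [10, 20, 30]})
        pandas_out = pdf.groupby("city", as_index=False)["sales"].sum()
        print("pandas result:")
        print(pandas_out)

        # --- equivalent PySpark version ---
        spark = SparkSession.builder.appName("pandas-to-pyspark").getOrCreate()
        sdf = spark.createDataFrame(pdf)
        spark_out = sdf.groupBy("city").agg(F.sum("sales").alias("sales"))
        print("pyspark result:")
        spark_out.orderBy("city").show()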

    $153 (Avg Bid)
    38 bids