* Please specify the position you are applying for in the email subject.

GCP Java Developer (6 Mon+ Contract in San Francisco, CA)

  • Java, Pub/Sub

  • Strong GCP developer

Dataproc Kerberos (6 Mon+ Contract in San Francisco, CA)


  • Large-scale Dataproc cluster configuration with Kerberos

  • Security: authentication realms and the GCP Token Broker


Hive Specialist (6 Mon+ Contract in San Francisco, CA)


  • Detailed understanding of Hive Metastore/HCatalog internals and how to customize them

  • Fluent in Hive SQL with an understanding of how to make performance trade-offs

  • Comfortable with both MapReduce (MR) and Tez runtimes


HBase Benchmarking (6 Mon+ Contract in San Francisco, CA)


  • HBase architecture, benchmarking and performance monitoring

  • Required: solid understanding of the HBase read path and write path (LSM tree, HFiles, flushes, MemStore, BlockCache, indexes, compactions, region splits, etc.)

  • Strongly desired: intermediate-level Java programming skills

  • Nice to have: Apache Phoenix, Google Cloud (Compute Engine, Networking, Persistent Disks, GCS, GCP Security concepts)


Dataproc BigQuery (6 Mon+ Contract in San Francisco, CA)


  • 1+ year with Dataproc on Google Cloud Platform is a must

  • Demonstrable knowledge and experience using Google Cloud BigQuery

  • 1+ year of strong GCP data experience

  • 6+ months of experience with Dataflow; GCP Data Engineer certification is a plus

  • 3+ years of hands-on experience with the Hadoop stack and Java; 10+ years of total experience

AWS Chef Architect (12 Mon+ Contract in Los Angeles, CA)


  • End-to-end solutions for automated build and deployment; manage server and cloud infrastructure with Docker and EE

  • 5+ years of cloud experience (AWS/GCP)

  • 5+ years of Linux (preferably Red Hat Linux and Ubuntu)

  • 5+ years managing and deploying Chef; very proficient in writing cookbooks and recipes

  • 3+ years with container technologies (Docker/Kubernetes); strong Bash/Ruby scripting; excellent communication; experience in the broadcast or media industry preferred


AI Engineer (6 Mon+ Contract in Houston, TX)


  • Design, develop, and deliver AI/machine learning enabled solutions for specific industry problems

  • Build scalable, available, and supportable processes to collect, manipulate, present, and analyze large datasets in a production environment

  • Articulate problem definition and work on all aspects of data including acquisition, exploration/visualization, feature engineering, experimentation with machine learning algorithms, deploying models

  • Develop working prototypes of algorithms, evaluate, and compare metrics based on the real-world data sets

  • Provide design input specifications, requirements, and guidance to software engineers for algorithm implementation for solution/product development

  • Decide when a model is ready for deployment and monitor its accuracy over time to determine when it needs to be retrained or replaced

  • Collaborate and communicate with data scientists and operations professionals to help manage the production machine learning lifecycle

  • 4-13+ years of professional experience is required

  • Degree in applied math, statistics, machine learning, or computer science; MS preferred

  • Deep understanding of statistics and experience with machine learning algorithms/techniques

  • Proven programming skills, particularly in Python; strong experience with DL frameworks such as TensorFlow, Torch, and others

  • Scientific expertise and real-world experience in deep learning (convolutional neural networks, LSTMs)

  • Experience in building distributed deep learning models leveraging GPUs

  • Passion for solving challenging analytical problems

  • Ability to quickly assess a problem both qualitatively and quantitatively

  • Ability to work productively with team members, identify and resolve tough issues in a collaborative manner

  • Experience in applying machine learning techniques to real-world problems in a production environment

  • Experience with the Anaconda Enterprise platform

  • Experience building and deploying machine-learning models at scale on Kubernetes

  • Publications in the machine learning space are preferred

  • Worked on large-scale, real-world deep learning problems at a startup or large enterprise

  • Open-source contributions to DL algorithms, libraries, and frameworks

  • Worked in distributed computing environments such as Hadoop and Spark; exposure to NoSQL technologies such as MongoDB and Cassandra

  • Passion for data

  • Experience participating in and winning ML/DL competitions such as hackathons or Kaggle


Azure Data Architect (6 Mon+ Contract in Houston, TX)


  • Responsible for supporting the Modern data and analytics portfolio by working with customer and internal stakeholders (business clients, business analysts, developers, other architects), defining customer solutions, and delivering engagements that are innovative and exceed business requirements

  • Collaborate with other team members/Partners to develop and architect the Cloud (Azure) Data Platform and deliverables

  • Stay abreast of architectural and industry changes, especially in the areas of big data and the cloud

  • This position will assist with architecting of customer-focused solutions based upon market and customer needs on Cloud (Azure) Data Platform. This includes the architecting of the solution and project artifacts for current and new service offerings along with the quality of the delivery to customers

  • Meets with clients to determine needs and/or identify problem areas and suggest solutions

  • Knowledge of and experience with the Cloud (Azure) Data Platform: Azure Data Lake, Data Factory, Data Management Gateway, Azure Storage options, DocumentDB, Data Lake Analytics, Stream Analytics, Event Hubs, Azure SQL, Kafka, Parquet, Neo4j, Cosmos DB. Experience with Azure Data Factory (ADF) creating multiple complex pipelines and activities using both Azure and on-prem data stores for full and incremental data loads into a cloud DW

  • Experience in establishing hot path and cold path data pipelines with automated batch, micro-batch, and incremental data refreshes

  • Knowledge of tabular OLAP, including PowerPivot and SSAS Tabular mode

  • Understanding of how traditional processes apply to the modern world (be able to talk about where big data and NoSQL can replace traditional enterprise data stores, Graph DB, how Azure Data Factory can supplement/replace SSIS, etc)

  • 5+ years of recent experience developing solutions in a Microsoft Azure DevOps environment: a SQL Server environment using SQL, T-SQL stored procedures, and T-SQL functions, and/or an Oracle environment using SQL*Plus and PL/SQL, as well as familiarity with complex SQL Server scaling concepts (AlwaysOn, geo-scaling, sharding, large-scale database management)

  • Experience managing Azure Data Lakes (ADLS) and Data Lake Analytics and an understanding of how to integrate with other Azure Services. Knowledge of USQL and how it can be used for data transformation as part of a cloud data integration strategy

  • Experience with Python, Azure SQL DW. Understanding of when to use Azure SQL DW vs Azure SQL Server/DB and loading patterns to move data from blob or ADLS into Azure SQL DW

  • Exceptional customer engagement, interpersonal, presentation and overall communication skills

  • Ability to successfully handle multiple work streams across multiple engagements while maintaining the composure and professionalism to meet tight deadlines and shifting priorities

  • Quickly learn and adapt to new business and technical concepts

  • Goal-oriented team player committed to quality and detail

  • Innovative thinker who is positive, proactive and readily embraces change.

  • Strong understanding of Big Data, Data Visualization Business Intelligence and Data Warehouse concepts and best practices, with an understanding of its strategic importance to organizations

  • Superior conceptual and analytical abilities, identifying opportunities for improvement through analysis and creative thinking.


Snowflake Architect (Long Term + Contract in Houston, TX)

  • Snowflake with any cloud will work

  • The Snowflake architect will be responsible for architecting and implementing very large-scale data intelligence solutions around the Snowflake data warehouse

  • Solid experience in and understanding of architecting, designing, and operationalizing large-scale data and analytics solutions on the Snowflake cloud data warehouse is a must

  • The Snowflake architect should have deep data management skills to support cloud big data engineering and analytics agendas, including capabilities such as cloud data warehouse and data lake, cloud data engineering, on-premise-to-cloud data migration, data ingestion, and data curation. Act as a technical and solution expert in the areas of Snowflake data warehouse and data warehouse on cloud.

Senior Platform Architect (Long Term + Contract in North Houston, TX)


  • Senior Platform Architect will specialize in Solution Architecture, Development, DevOps, SysOps, Big Data, Security and Networking

  • Demonstrate knowledge of cloud architecture and implementation features (OS, multi-tenancy, virtualization, orchestration, elastic scalability)

  • Act as a Subject Matter Expert to the organization for cloud end-to-end architecture

  • Solid understanding of On-Premise and Cloud Directory Services Architecture

  • Develop solutions architecture and evaluate architectural alternatives for private, public and hybrid cloud models, including IaaS, PaaS, and other cloud services

  • Architect Azure Cloud services using Identity, Storage, Compute, Automation, Disaster Recovery, and Networking features.

  • Articulate the possibilities of the Microsoft Azure Cloud with specific emphasis on data services including Databases, Data Lakes, Analytics, AI + Machine Learning, Internet of Things, and Business Intelligence

  • Accelerate the customer’s digital transformation journey using Microsoft Azure SQL DB, Azure SQL DW, Cosmos DB, Big Data, and Advanced Analytics platforms

  • Ensure all cloud solutions follow security and compliance controls

  • Participate in the establishment of an automated DevOps release management pipeline which delivers tooling for next-generation application development efforts (the Dev) and ongoing production operations (the Ops)

  • Cultivate a Continuous Integration/Continuous Delivery mindset

  • Ensure development teams are provided a full set of DevOps ALM tools by leading the establishment of the right tooling and processes, including automated build processes, environment setups, testing scripts, deployments, and production operational metrics/debugging information


Platform Engineers (Long Term + Contract in North Houston, TX)

  • Design and deployment of global, highly available, large enterprise and commercial scale cloud infrastructure

  • Advanced knowledge of cloud and physical/virtual infrastructure concepts, technologies and patterns

  • Experience deploying and using Container-as-a-Service (CaaS) and Platform-as-a-Service (PaaS) technologies like Red Hat OpenShift or Azure AKS

  • Docker Container Orchestration with Kubernetes

  • Authentication and Authorization protocols (OAuth 2.0, OpenID Connect, SAML)

  • Networking concepts and protocols (DNS, VPN, etc.)

  • Design/development skills in cloud-native applications using various technologies (e.g. Spring Boot, Java, Node.js, Ruby, Go, .NET); experience with SQL and NoSQL database management

  • Experience with Terraform/Ansible

  • Experience setting up CI/CD

  • Strong experience with Azure DevOps and ARM Templates 



Senior Data Engineers (Long Term + Contract in North Houston, TX)


  • Strong programming skills in .NET or Python

  • Azure architecture, implementations, and migrations

  • Strong background in data architecture, data mining, and data cleansing

  • Strong analytical skills

  • Agile and DevOps Team Experience

  • Azure resiliency and availability frameworks

  • Azure Storage - Blob, File, Block, Tables, Databases

  • Azure Compute - VMs, Web Apps, Functions

  • Azure Streaming – Event Hubs, Event Grid

  • Basic understanding and knowledge of build and release pipelines, configuration of build agents and Azure connectivity

  • Understanding of best practices to migrate SQL and NoSQL databases from on-premises to cloud and cloud to cloud using various tools


Data Science Engineers (Long Term + Contract in North Houston, TX)


  • Bachelor's or Master's degree in a highly quantitative field (Computer Science, Machine Learning, Operational Research, Statistics, Mathematics, etc.) or equivalent experience

  • Industry experience in predictive modeling, data science and analysis

  • Experience in a ML or data scientist role and a track record of building ML or DL models

  • Experience using Python and/or R; knowledge of SparkML

  • Experience with big-data technologies such as Hadoop/Spark

  • Ability to produce well-written and explainable production level code

  • Experience using ML libraries, such as scikit-learn, caret, mlr, mllib

  • Experience working with GPUs to develop models

  • Experience handling terabyte size datasets

  • Experience diving into data to discover hidden patterns

  • Familiarity with using data visualization tools

  • Knowledge and experience of writing and tuning SQL

Graph Expert (Long Term + Contract in North Houston, TX)


  • Experience working with a graph database or graph framework like Datastax, Neo4j, Neptune

  • Experience designing and implementing graph data models

  • Strong skills in graph query languages (Gremlin / Cypher), search/index (Solr or Lucene or ElasticSearch), Cassandra, and Hadoop tooling

  • Experience working in a Java and/or Scala environment

  • Experience with Hive and/or HBase; experience with Spark 2.x

  • Experience with Kafka or other similar large-scale messaging technologies

  • Experience implementing web services; knowledge and understanding of SDLC, Agile/Scrum procedures, CI/CD, and automation

  • Experience with Microsoft Azure

  • NoSQL database development experience

  • Experience working with large data sets and pipelines, ideally using the Apache


UX Developer (Long Term + Contract in North Houston, TX)


  • Lead new user interface designs through the full design and development cycle (including concepts, information architecture, visual design and interaction design)

  • Strong vision for products, clear understanding of usability, user-centered design, accessibility and web standards

  • Practical exposure and experience with AngularJS, jQuery, JSON, AJAX

  • Well versed with Microsoft technologies - HTML5, CSS3, Sass, Bootstrap, React (or Angular), jQuery, ES6, TypeScript, C#, .NET, ASP, XML, HTML; SQL would be a plus

  • Visual Studio, TFS, Git

  • Writing standards-compliant front-end code for websites, web applications, and mobile solutions using HTML, CSS, and JavaScript

  • Familiarity with code versioning systems (SVN, GIT)

  • Familiarity with agile methodology and Scrum

  • Solid understanding of user-centered design principles

  • Excellent communication skills and should be able to clearly articulate design/coding decisions

  • Experience presenting UX recommendations


UI Developer (Long Term + Contract in North Houston, TX)


  • Extensive experience in report design and development

  • Build and publish customized interactive reports and dashboards along with data refresh scheduling using Tableau / Spotfire / Power BI

  • Experience installing, supporting, upgrading, managing licenses, backing up the server, adding and changing nodes, and managing security and authentication for Tableau or Power BI

  • Create scorecards and dashboards using stacked bars, bar graphs, scatter plots, geographical maps, line/pie graphs, and Gantt charts

  • Mastery of the elements of graphic design - typography, color, composition, and design grids

  • Performance tuning experience related to reporting queries required

  • Knowledge of data warehouse techniques, methods, logging, and dimensional modelling

  • Develop complex data models to federate data from multiple data sources. Report and dashboard performance tuning

  • Experience with Azure Cloud platform

  • Familiarity with code versioning systems (SVN, Git)

  • Familiarity with agile methodology and Scrum



Data Architect (Long Term + Contract in Florida)


  • 10-12 years' experience in data modeling, Java 8, and Play/Akka implementations; proficient with Synergex DBL, ETL, and data transmission tools

  • Must have lead as Data Modeler for at least one Enterprise level program

  • Enterprise data migration

  • Experience working with data lakes and REST APIs

  • Proficient with domain driven design concepts

  • Will be responsible for technically leading the software projects through all stages of the life cycle, including responsibility for requirements capture, design, development and acceptance testing

  • Must have good analytical, troubleshooting, and problem-solving skills

  • Must have good communication skills

  • Experience working in an onsite/offshore model; proficient in technical documents such as UML, HLD, and LLD

  • Develop Extract Transform Load (ETL) process and data structures using efficient programming standards and practices.


MDM Architect (24 Mon+ Contract in Bothell, WA)

Role Overview: A Customer Hub Architect at T-Mobile is responsible for defining the solution architecture that enables the capabilities described by the Analyst/BRD in the most scalable and effective manner to create customer value. The Customer Hub Architect is the lead on the design phase of the project, although the analyst may help with the recommended solution design. The architecture team determines and documents the data integration (ongoing) and data acquisition (initial load) strategies in the FSD, along with the applications provisioning strategy for use by the Dev team to create the complete customer picture. Additionally, the Architect acts as an Application (Oracle Customer Hub) SME and technical architect for the application. The architect and analyst will work together to support the development phases for the Customer Hub.

  • Lead on the design phase of the project; deliver a scalable and effective solution

  • Leading solution/design whiteboard sessions

  • Communicating clearly with PM on Customer Hub dates/gates/issues and escalating appropriately

  • Solution Architecture drawing, data and web services design sections of the FSD

  • Driver of architecture discussions and decisions to get closure on design phase

  • UI Design, where applicable

  • Data acquisition strategy for all prioritized attributes (for initial load)

  • Data integration strategy for all prioritized attributes (for ongoing real-time web services)

  • Source-to-Target mappings by target UCM (with DQ, rules and logic)

.Net Cloud Architect (Long Term + Contract in Louisville, KY)

Experienced lead/architect with design and hands-on development experience in .NET-based microservices, keeping in mind performance, scalability, and deployments to the cloud.

  • Great knowledge of and/or working experience with core design patterns, DevOps and automation, SOA, microservices, event-driven architecture, cloud-native engineering, cloud computing architecture (PCF/Azure/AWS/GCP), and emerging technologies

  • Experience creating conceptual and logical architecture artifacts

  • Excellent communication skills to present technical solutions to the leadership

  • Ability to analyze, suggest and derive multiyear strategy and technology roadmap

  • Good-to-have qualifications:

  • Cloud specialist (Google Cloud Platform/AWS) with hands-on experience setting up cloud infrastructure.

SDL Tridion Architect (Long Term + Contract in Farmington, CT / Palm Beach Gardens, FL)

  • Sr. Architect-level resource with expertise in .NET, MVC, and SDL Tridion Web 8.5; hands-on SDL Tridion development experience including the following:

  • Installation and deployer configuration.

  • Topology Manager configuration.

  • Core services & content delivery.

  • Event system and workflow.

  • CMS - creating structures and templates, adding new web content, and modifying existing web content, etc.

  • Experience in DXA framework.

  • Experience in SDL Tridion administration areas such as blueprinting, user/group management, publication targets, and target groups.

  • Good knowledge of various SDL Tridion modules such as the ECL connector, event handler, publisher, deployer, content broker, dynamic link resolver, and Content Porter.

  • Strong knowledge of the dynamic publishing model.


Scrum Master (Long Term + Contract in North Houston, TX)

Tableau Developer (3 Mon+ Contract in Culver City, CA)

  • At least 4 years of experience with Tableau in a data warehousing environment

  • Excellent SQL skills

  • Strong ETL skills, experience with Informatica, or similar toolsets

  • Understanding of data warehouse design

  • Excellent verbal and written communication skills

  • Business acumen 

  • Strong analytical skills 

  • Passion for delivering awesome solutions that exceed client expectations

  • Highly desired: ability to work in a highly agile environment

  • Experience with Web technologies (HTML, CSS, JS)

  • Experience connecting to Snowflake and Redshift as data sources

  • Tableau Certification

MemSQL (Contract)


  • Strong Hadoop administration skills

  • Strong SQL, PL/SQL, Procedures, Packages, Triggers

  • Expertise in designing and architecting large-scale database deployments for production, development and testing environments, which include planning, configuration, installation, performance tuning, monitoring and support

  • Experience working in Massively Parallel Processing (MPP) database platforms such as Greenplum or Exadata.

  • Knowledge of and experience in emerging cloud-based databases such as MEMSQL, Snowflake, Azure

  • Good understanding of Distributed Systems and Parallel Processing architecture

  • Good knowledge in Big Data, Distributed computing and Hadoop

  • Experience in any of - Hive, MongoDB, MemSQL, Spark etc.

  • Experience working in GIT

  • Check that data models and databases are in sync; be able to identify, monitor, and examine data discrepancies in a large data warehouse environment

  • Familiar with SQL on Netezza, Oracle, MemSQL, Hadoop big data, etc.; able to construct optimized query statements

  • Good at SQL performance tuning



DevOps Engineer (6 Months+ Contract in Glendale, CA)


  • Collaborate across teams to build and maintain secure development and production environments

  • Ensure 24x7 availability as necessary to troubleshoot urgent issues

  • Establish amazing transparency of system performance through monitoring

  • Work directly with engineers to review their designs and improve their technical strength

  • Build, maintain and monitor cloud and corporate networks that employ enterprise security tools such as VPN, SIEM, IDS/IPS, HIDS, WAF, etc.; as well as basic Firewall Administration, corporate security

  • Drive change and develop highly reputable processes and have a desire to automate EVERYTHING

  • Maintain Cloud platforms such as AWS, container technologies such as Docker & Kubernetes, CI & CD tools, monitoring and other tools related to web operations such as New Relic

  • Web development DevOps support for a SaaS platform built with TFS, leveraging Angular, .NET Core, MS SQL Server, and related technologies and plugins

  • Maintain data files and monitor system configuration to ensure data integrity. Support the delivery of mission-critical data to the appropriate people in and outside the company

  • Maintain the functionality, security, and uptime of critical technology systems, networks, AWS and storage infrastructure, and communication systems

  • Support CISO with security related vendor questionnaires, evaluations, and requests



  • 4+ years of overall professional experience

  • 2+ years of demonstrable systems experience in scaling web, mobile, data and systems platforms

  • 2+ years of IDS/IPS/HIDS systems, DLP systems, firewalls, SIEM systems, and vulnerability scanning tools. Security Certifications a plus!

  • Strong hands-on technical knowledge of AWS

  • Proficient in leveraging CI and CD tools to automate testing and deployment

  • Automation experience with AWS CloudFormation

  • Experience working with the following tools: Docker, Kubernetes

  • Solutions oriented

  • Experience in real-time monitoring: AWS CloudWatch, New Relic, and Splunk

  • Motivated and ambitious, interested in growing and developing

  • Bonus points for experience in HITRUST or SOC 2 compliant environments

Principal Data Architect (1 Year+ Contract in Washington, DC)


  • Experience in and strong knowledge of the financial industry

  • Mortgage industry experience is very nice to have

  • Strong ability to convert business problems into technical solutions

  • Strong understanding of AWS and of data storage mechanisms in AWS

  • Familiarity with compliance standards (CCAR, DFAST, BCBS 239, CCPA, SOX)

  • Knowledge of Oracle DB architecture is nice to have

  • Knowledge of Netezza is a must; knowledge of Redshift is a must



  • The data architect will interview business stakeholders and tech team personnel to understand business requirements, data flow, and current challenges in several legacy data systems.

  • The data architect will design a new-generation AWS-native data lake platform. The new system will need to meet the compliance standards mentioned above.

Senior Talend ETL Developer (6 Months+ Contract in Washington, DC)

Design, develop, validate, and deploy Talend ETL processes using Talend data integration and data quality tools in an AWS environment. Successful candidates should possess experience with the Agile development methodology, as this is a dynamic environment; solid exposure to many of the other technologies listed below is also expected.



  • Talend 

  • AWS Services 

  • Python/Java 

Required qualifications to be successful in this role: 


  • 6+ years of experience in ETL Development 

  • Hands-on experience with Talend development

  • Good experience with Ab Initio/Informatica

  • Expert level understanding of ETL framework and data warehousing, as it relates to cloud/AWS 

  • Expert-level demonstrated experience in developing code, implementation, and adopting a cloud strategy

  • Experience working in cloud (AWS) and big data environments

  • Proficiency in at least one language common to cloud platforms, such as Java or Python

  • Test-driven development and/or behavior-driven development 

  • Familiarity with GIT and managing branching

Senior Workday Financial Module Architect (6 Months+ Contract in Washington, DC)

Finance IT team is seeking a Senior Workday Financial Module Architect who will be an exceptional addition to our team. As a Senior Workday Financial Architect you will be responsible for configurations of various Workday Financial modules such as General Ledger, Business Assets, Supplier Accounts, Banking and Settlement, Expenses, Procurement, Accounts Receivable, and Projects. This role demands excellent knowledge of finance business processes and you will be responsible for implementing key projects and initiatives across the Workday Financials platform. The right person for this role should have strong systems analysis background in Workday, and a demonstrated ability to learn and apply new technologies to solve business challenges. If you are someone who has a good work ethic, keen interpersonal skills, and loves working in a fast-paced environment, then this is a great role for you.


  • Work closely with technology and business partners to architect the new Workday financial environment (general ledger and sub-ledger), and design integration with existing systems in a complex enterprise environment. Some of the existing systems are on AWS and some are on-prem.

  • Work closely with the business partners to understand requirements and execute them thoroughly following the software development lifecycle (requirements gathering, design, development, testing, and deployment).

  • Lead Workday semi-annual upgrades, including project planning, development of test plans, and supporting business partners during the process. This will also include incorporating new functionality from the semi-annual releases, contributing actively to the Workday community (entering, monitoring, and voting on brainstorms), logging Workday cases (such as bug fixes/enhancements), and identifying additional functionality that would apply to the Company's operations.

  • Assist in security administration tasks, including assigning appropriate security roles, performing semi-annual user access reviews, and contributing to security design discussions as necessary.

  • Design/develop advanced, matrix, and composite reports and dashboards based on business needs. Create and maintain important metrics that help the Accounts Payable, Procurement, and Accounts Receivable teams in daily operations.

  • Serve as the main point of contact for all Workday Financial functional modules, monitor various integration errors, and fix and resolve production issues in a timely manner.

  • Identify and prioritize Workday enhancements by partnering with business teams, and prepare the necessary training materials for new enhancements and bug fixes.

  • Collaborate with other teams on multi-functional projects providing functional support on behalf of team.

  • Demonstrate Workday knowledge and expertise to find opportunities for process improvements. Provide direction and mentorship as well as end-user support.

Required Experience:

  • Hands-on experience with Workday Finance, with a minimum of 3 years of experience in the Workday Finance Suite.

  • Strong concepts of Workday frameworks such as Integrations, Business Processes and Security.

  • Strong API understanding.

  • Report writing capabilities including composite reports, dashboards, and custom scorecards.

  • Ability to work quickly and accurately under pressure and with time constraints

  • Excellent verbal and written communication skills with the ability to communicate effectively at all levels of the organization. 

  • Experience in AWS.


© 2020 AVM Consulting. All rights reserved.