The data scientist will deliver an overarching data architecture model that is cloud compatible and based on service-oriented architecture (SOA) principles and best practices, producing a forward-leaning architectural blueprint that includes foundational infrastructure elements (i.e., network, storage, platform, middleware, etc.).

Mandatory skills:
- Demonstrated experience with analytical and data engineering skills
- Demonstrated experience with data architecture, data modeling, database design, and data systems implementation, especially with Oracle-based technologies such as MySQL, and with Microsoft SQL Server
- Demonstrated experience with data processing operations: designing system data flows, mapping the interfaces used to manage data, setting standards for data management, and analyzing the current state and conceiving the desired future state
- Demonstrated experience with big data structures such as triplestores, RDF, and SPARQL (see the sketch after this posting)

Clearance: current, active TS/SCI with polygraph

Desired skills: csvkit, jQuery DataTables, Tableau, OpenRefine, KNIME, RapidMiner or Solver, Eclipse, Ant, JUnit, and Apache Tomcat

Employment: 90-day review period; contract up to 12 months, renewable
Location: Milpitas, CA
Salary: negotiable
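As a quick illustration of the least common requirement above, here is a minimal, hypothetical sketch of loading RDF triples and running a SPARQL query with Python's rdflib; the namespace and data are invented:

# Minimal RDF/SPARQL sketch with rdflib; the ex: namespace and triples are invented.
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/")
g = Graph()
g.add((EX.alice, RDF.type, EX.Employee))
g.add((EX.alice, EX.clearance, Literal("TS/SCI")))

# SPARQL over the in-memory graph; a real triplestore exposes the same query language.
for person, clearance in g.query("""
        PREFIX ex: <http://example.org/>
        SELECT ?person ?clearance
        WHERE { ?person a ex:Employee ; ex:clearance ?clearance . }"""):
    print(person, clearance)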
Zone24x7 Inc. is a privately held company headquartered in San Jose, CA. We specialize in the design, development, and implementation of innovative custom technology for retail. We are seeking to hire a talented Architect/Lead Software Engineer to work on data science and big data solutions with one of our premier clients, a major U.S. retail chain.

Job description: As an Architect/Tech Lead in data science and big data solutions, you will architect, design, develop, provide onsite support, and coordinate the engineering collaboration between onsite and offshore teams, meeting and exceeding client expectations in engineering delivery. You will create road maps that translate business requirements into technology solutions, and ensure that application and technical infrastructure development complies with company and industry standards. You will influence future systems architecture, assess emerging technologies and vendors for application to business initiatives, and provide technical expertise and direction for next-generation initiatives. You will be responsible for providing hands-on engineering leadership to the rest of the teams while handling multiple solutions and stakeholders, including different technology vendors and offshore engineering teams.

Role:
- Provide technical coordination, guidance, and support to bridge the U.S.-to-offshore work-time difference
- Participate in technical discussions with client stakeholders, translate client business needs into solution architectures, and communicate those architectures effectively to offshore teams
- Liaise with client stakeholders and other vendor teams to build good relationships and provide status updates while sorting out any complications
- Work closely with the big data platform and microservices business unit heads and architects offshore to define strategic roadmaps that deliver the client's business needs, and take responsibility for consulting with client stakeholders so they understand and accept the delivery roadmap
- Provide continuous support to coordinate with various technical and business teams onsite for any escalations or coordination required by the offshore team to complete deliveries on time
- Be hands-on in engineering, with the ability to deep-dive into coding where necessary
- Stay alert for new project opportunities from client stakeholders, consolidate already-developed capabilities into quick, brainstormed solutions to client needs, and take responsibility for presenting such solutions to technical audiences on the client end via discussions, architecture diagrams, presentations, etc.
- Partner with enterprise security, EAS, DevOps, and other engineering teams from the client end, and provide necessary oversight across all project deliveries
- Travel offshore on a short-duration basis upon request for networking, broader solutioning discussions, etc. (company funded)

Required experience and skills:
- Hands-on with Hive, Pig, Apache Spark, and the Hadoop stack in general; experience working with both batch and streaming data
- Experience with machine learning methods such as clustering, regression, classification, frequent-item mining, and collaborative filtering
- Strong Python, Scala, Java, and R language skills, with experience writing MapReduce jobs, UDFs, etc. (see the sketch after this posting)
- Hands-on experience with cloud platforms such as AWS, GCP, and Azure
- Deep understanding of database and analytical technologies in the industry, including MPP databases, NoSQL storage (Bigtable, BigQuery, or HBase), data warehouse design, BI reporting, and dashboard development
- Broad and deep knowledge of big data solutioning, and of solutioning, developing, and productionalizing highly scalable, fault-tolerant enterprise middle-tier applications
- Broad and deep knowledge of software architecture, design, and integrations, considering functional and non-functional requirements
- Excellent communication, articulation, and leadership skills to connect with stakeholders at different levels
- Able to run pilots and POCs quickly and deploy them to production
- Ability to engage and interact with various technical, business, and outside teams; ability to understand and assimilate any technology quickly; ability to plan and execute a technology product roadmap

Education and experience:
- Bachelor's or master's degree or equivalent (MIS, computer science, or engineering preferred); professional qualifications in big data, data science, and/or enterprise architecture are an added advantage
- 6+ years of total experience, including at least 5+ years of solid hands-on coding exposure (and the scars to prove it), with the most recent 2+ years in an architect/software engineer role involving data science and big data applications
- Experience with Scrum and other agile development processes
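Since the skills list above calls out writing MapReduce jobs and UDFs, here is a minimal sketch, assuming PySpark is installed, of the kind of user-defined function such work involves; the data and tier logic are invented:

# Sketch: a Spark job with a Python UDF; data and thresholds are invented.
from pyspark.sql import SparkSession
from pyspark.sql.functions import udf
from pyspark.sql.types import StringType

spark = SparkSession.builder.appName("udf-demo").getOrCreate()
df = spark.createDataFrame([("alice", 3), ("bob", 11)], ["user", "visits"])

@udf(returnType=StringType())
def tier(visits):
    # Classify each shopper by visit count.
    return "frequent" if visits >= 10 else "occasional"

df.withColumn("tier", tier(df.visits)).show()
spark.stop()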
Job overview / company summary: Are you someone who can provide leadership and expertise in managing the Amazon Web Services data lake environment for a global pharmaceutical company in the Cambridge, Massachusetts area with a market presence in over 70 countries? Are you technical, with hands-on experience, but comfortable managing a team? If so, you will be very interested in learning more about this Big Data Platforms Manager opportunity!

What you'll be doing:
- Establishing the analytic environments required for structured, semi-structured, and unstructured data, including a cloud-based infrastructure with data processing and visualization capabilities
- Maintaining oversight of platforms, technology stack, and operations for the AWS data lake (see the sketch after this posting)
- Maintaining current knowledge of big data and IoT developments, opportunities, and challenges
- Developing innovative solutions, systems, and products to support objectives across multiple business functions

What you should have to be successful:
- BS in computer science, engineering, statistics, applied math, or equivalent
- 8-10 years of relevant work experience, including a few years with pharma/biotech or consulting for pharma/biotech preferred
- 3+ years of experience designing and developing cloud-based solutions (preferably on AWS)
- Machine learning / data science background in informatics (bioinformatics or cheminformatics)
- Solid understanding of ETL architectures, database technologies, performance optimization, and building communication channels between structured and unstructured databases
- A track record of success, including effectively leading and managing diverse business functions and multiple projects with a variety of stakeholders

The world of technology continues to grow, and this role offers a huge opportunity to join a growing group within an established firm. Resumes are being considered on a first-come, first-served basis and interviews are already being conducted, so if you are interested in this role, please send your details to apply to a33ho1gn376@glocomms.aptrack.co.uk or call 646-647-3948 immediately, as this role is extremely active.
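For a flavor of the day-to-day AWS data lake work described above, here is a minimal sketch using boto3; the bucket, prefix, and file names are all invented:

# Sketch: routine data-lake housekeeping with boto3; all names are invented.
import boto3

s3 = boto3.client("s3")

# Land a raw file under a dated prefix in the (hypothetical) lake bucket.
s3.upload_file("assay_results.csv", "example-datalake",
               "raw/assays/2018/assay_results.csv")

# Inventory what has landed under the raw zone.
resp = s3.list_objects_v2(Bucket="example-datalake", Prefix="raw/assays/")
for obj in resp.get("Contents", []):
    print(obj["Key"], obj["Size"])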
A 6-month contract-to-hire opportunity for a Data Architect for the State of TN in Nashville, TN! There are two roles on the team, one data scientist and one MDM, that will work closely together, enterprise-wide with all departments at SOTN, to create strategy, architecture, and planning for all database activities. These folks will want to have experience in multiple environments (SQL, Oracle, COBOL, etc.) to succeed, as all are part of the environment at SOTN.

Knowledge Services, established in 1994 and headquartered in Indianapolis, IN, is a certified woman-owned (WBE) professional services organization with over 1,500 employees located in offices throughout North America. Founded by Julie Bielawski, CEO, Guidesoft Inc., DBA Knowledge Services, is an industry leader in managed service programs (MSP), employer-of-record and payrolling services, and national recruitment and staffing services. We provide outstanding services to major organizations in various industries, including IT, healthcare, entertainment/media, federal and state governments, public utilities, telecom, manufacturing, and more. As such, Knowledge Services is committed to providing opportunities for growth: in our company, in each team member, and in our relationships. We believe titles do not define a person but provide a framework for each person's endless potential. Our focus on improving our team, product, and processes drives us every day. We are guided by our four pillars that set the foundation of who we are and how we conduct business: knowledge, integrity, innovation, and service.

Knowledge Services' benefit offerings include the following:
- Medical, dental, and vision coverage
- Voluntary life and AD&D coverage
- Pet insurance
- Ticket and event discounts!
The above are available provided contractors meet eligibility requirements.

Scope: Data analysis and data privacy, working enterprise-wide with all agencies (SQL, Oracle, and COBOL environments) to compile, organize, and strategize data plans. There is a strong emphasis on reporting and normalization across many different platforms from an architectural perspective, and on a sciences and informatics background.

Summary: Under direction, is responsible for assisting the Chief Data and Informatics Officer in overseeing all ongoing activities related to the development, implementation, maintenance of, and adherence to the state's policies and procedures covering privacy and access.

Distinguishing features: An employee in this class will work closely with several lines of business to develop, define, and manage the overall data privacy policy at an enterprise level and statewide. This classification reports to the Chief Data and Informatics Officer.

Updating and using relevant knowledge:
--Maintains current knowledge of applicable federal and state privacy laws and accreditation standards, and monitors advancements in information privacy technologies to ensure organizational adaptation and compliance.
Identifying objects, actions, and events:
--Performs initial and periodic information privacy risk assessments and conducts related ongoing compliance monitoring activities in coordination with the entity's other compliance and operational assessment functions.
Developing objectives and strategies:
--Works with other C-level executives, particularly those whose areas of concern overlap, such as the Chief Information Officer (CIO), the Chief Information Security Officer (CISO), the Chief Systems Architect, the Chief Data & Informatics Officer (CDIO), and the Enterprise Security Officer.
Training and teaching others:
--Assists, directs, delivers, or ensures delivery of initial privacy training and orientation to all employees and professional staff, contractors, alliances, business associates, and other appropriate third parties in regard to data privacy.
Organizing, planning, and prioritizing work:
--Works with domain leaders to review ongoing activities related to the development, implementation, maintenance of, and adherence to the state's policies and procedures covering the privacy of, and access to, state information in compliance with federal and state laws (i.e., HIPAA, FTI, PCI, SSA, FERPA, CJIS, FISMA).
--Reviews all system-related information security plans throughout the state's network to ensure alignment between security and privacy practices, and acts as a liaison to the information systems department.
Evaluating information to determine compliance with standards:
--Serves as information privacy consultant for all departments and appropriate entities.
--Maintains comprehensive and current knowledge of both department operations and privacy laws, communicates details of the department's privacy policy to staff, and works closely with department legal staff.
--Collaborates with and assists the business and technology areas to develop corrective action plans for identified privacy compliance issues.
--Conducts privacy risk and impact assessments.
Monitoring processes, materials, or surroundings:
--Reviews all system-related information security plans throughout the organization to ensure alignment between security and privacy practices, and acts as a liaison to the information systems department.
--Ensures that data security practices (in particular logging, monitoring, and auditing practices) are not in conflict with privacy requirements.
--Monitors the status and effectiveness of privacy controls across service offerings, ensuring that privacy-related key risk indicators are effectively monitored to prevent an unacceptable impact on business objectives and reputation.
--Ensures that business process and technology teams and third parties follow the organization's privacy program, meet privacy policy requirements, and address privacy concerns.
--Develops an audit and compliance program to assure adherence to established standards.
Establishing and maintaining interpersonal relationships:
--Serves as the subject matter expert to all business domain privacy officers as it relates to industry data privacy controls around regulatory data.
--Liaises with the data architects, database administrators, and third parties to ensure that sensitive data is stored and monitored appropriately.
Interpreting the meaning of information for others:
--Serves as an internal advisor to the IT department and information security to interpret privacy-policy-related questions.
--Responds to regulatory authorities and all necessary parties in the event of any data breach.
--Advises senior management on the company's privacy compliance.
--Advises on any data protection questions across the organization, and supports business projects and initiatives to ensure compliance.
Analyzing data and information:
--Reviews the effectiveness of the privacy program, develops recommendations for improvements, and reports on the adequacy of the program at least annually.
--Serves as a facilitator for root cause analysis and correction of operational processes.
--Analyzes the privacy and security implications of new products and services, and offers guidance on ways to minimize privacy compliance risks.
--Performs regular data discovery exercises to ensure all sensitive data is identified and monitored.
Making decisions and solving problems:
--Works closely with the technology teams to anticipate potential privacy problems embedded in the use of emerging technologies (e.g., cloud, artificial intelligence, machine learning).
--Interfaces and coordinates with the CISO on matters affecting data security.
Monitoring and controlling resources:
--Oversees all matters related to the protection of personally identifiable information, including policy and procedures, subject access requests, data records retention, privacy by design, appropriate removal/disposal of data, etc.

Education and experience: Graduation from an accredited college or university with a bachelor's degree and five years of experience in information technology, information privacy laws, access/release of information, and release control technologies; or substitution of a specific associate's degree for the required bachelor's degree: an associate's degree in information technology, information privacy laws, access/release of information, and release control technologies; or substitution of experience for education: qualifying full-time experience in one of the following areas may substitute for the required education on a year-for-year basis, to a maximum of four years: information technology, information privacy laws, access/release of information, and release control technologies.
Note: The appointing authority of the department in which the position is located reserves the right to alter or supersede the above minimum qualifications based on their judgment, as warranted.
The primary responsibilities of this Senior Data Scientist / Data Engineer - Biologics role are to:
- Work closely with Research & Development Information Technology (R&D IT) and other global partners
- Design and extend new and existing data repositories, including the design of a data lake
- Automate data extraction, transport, transformation, and preprocessing pipelines and processes
- Manage and code ad hoc queries to enable data analytics
- Collaborate with internal and external colleagues and vendors to improve the operation of the data science team's infrastructure
- Participate in problem solving with team members from other functional teams and sites
- Communicate effectively through listening, documentation, and presentation, especially using compelling visualization tools

Required qualifications:
- PhD in data science, computer science, statistics, or a related field with a minimum of 4 years of relevant experience; or MS in data science, computer science, statistics, or a related field with a minimum of 8 years of relevant experience; or BS in data science, computer science, statistics, or a related field with a minimum of 10 years of relevant experience
- Proficiency in Python, SQL, and/or R is required
- Prior experience with extract-transform-load (ETL) pipelines is a must (a minimal sketch follows this posting)
- Experience leading a small team in software project management; machine learning, experimental design, and statistical design are desirable
- Excellent communication skills with scientists from diverse backgrounds
- Ability to work independently and as part of a project team; ability to prioritize tasks and work on a deadline

Preferred qualifications:
- 3+ years of experience in industry
- Familiarity and exposure to cloud platform administration
- Experience with NoSQL databases

Domestic relocation assistance is offered for this position.
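Since prior ETL pipeline experience is a must for the role above, here is a minimal extract-transform-load sketch using only the Python standard library; the file, table, and column names are invented:

# Minimal ETL sketch with the standard library; all names are invented.
import csv
import sqlite3

conn = sqlite3.connect("biologics.db")
conn.execute("CREATE TABLE IF NOT EXISTS samples (sample_id TEXT, assay TEXT, value REAL)")

# Extract rows from a raw CSV, transforming and filtering as we go.
with open("raw_samples.csv", newline="") as f:
    rows = [(r["sample_id"], r["assay"].lower(), float(r["value"]))
            for r in csv.DictReader(f) if r["value"]]

# Load the cleaned rows into the target table.
conn.executemany("INSERT INTO samples VALUES (?, ?, ?)", rows)
conn.commit()
conn.close()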
You are an awesome Java developer and data nerd who thinks lambdas are incredible, NoSQL is the way the world turns, MapReduce is old school, thread dumps are the best way to pass a weekend, and you crash local meetups to find cool software ideas. We're the sultry startup you've always wanted but haven't found. We specialize in applying data science techniques to discover value and insights in the growing volumes of data our clients are collecting. People analyze; we predict. People create software; we live software. People tag conversations as "too technical"; we live and breathe technical conversations. If you join us in these roles, your responsibilities would include the stuff you love best: the latest Java development on the coolest open-source technologies, solving intense problems in small teams of 5-10 equally awesome engineers and scientists. Your primary responsibility in this role will be working with Trove's data storage and analytics infrastructure to implement scalable machine learning analytics in partnership with Trove's data science team.

Minimal qualifications:
- MS in computer science
- 5-10 years of experience in a similar role
- Proficiency with Java 8 and enterprise software design patterns; knowledgeable of Java 8 features like lambda functions and the Streams API
- Hands-on (more than "you've heard of it") experience with Apache Spark, Apache Kafka, and performance tuning in large-scale computing clusters
- Strong understanding of multi-threading and concurrency design
- Knowledgeable in database design and optimization (SQL, NoSQL)
- Experience participating in an agile development team going through the full software development lifecycle in a product-oriented environment
- You absolutely need a creative mind, a keen ability, and the initiative to think "beyond"

Desired qualifications:
- Hands-on (more than "you've heard of it") experience with Apache Cassandra and performance tuning in large-scale computing clusters
- Proficiency with the Spring framework: Boot, Hibernate, Data, Batch
- Applied experience with Docker
- Applied experience with Hadoop

The client is looking for a Principal Big Data Developer with a deep understanding of big data who will enable big data analytical solutions for the client's mines, working with and alongside the digital team and business users to continuously collect insight from maintenance data and enable the client to do "smart" maintenance.

Responsibilities:
- Ensure that data pipelines are scalable, repeatable, and secure, and can serve multiple users within the company
- Enable big data and batch/real-time analytical solutions that leverage emerging technologies (a streaming sketch follows this posting)
- Collect, parse, manage, analyze, and visualize large sets of data using multiple platforms
- Translate complex functional and technical requirements into detailed architecture, design, and high-performing software; translate business requirements into system requirements
- Code, test, and document new or modified data systems to create robust and scalable applications for data analytics
- Implement security and recovery tools and techniques as required
- Work with developers to make sure that all data solutions are consistent
- Ensure all automated processes preserve data by managing the alignment of data availability and integration processes
- Develop standards and processes for integration projects and initiatives

Qualifications:
- Minimum of 10-15 years of experience, with 5 years in data science
- Master's or PhD degree in information technology, computer science, or a related quantitative discipline
- Understanding of high-performance algorithms and the R statistical software
- Experience in industry data science (e.g., machine learning, predictive maintenance) preferred
- Capability to architect highly scalable distributed systems using different open-source tools
- Demonstrated ability to facilitate and lead others and to work with minimal direction, with a proven ability to coordinate complex activities
- Excellent problem-solving, critical-thinking, and communication skills
- Demonstrated experience with agile or other rapid development methods
- Demonstrated experience with object-oriented design, coding, and testing patterns, as well as experience in engineering software platforms and large-scale data
- Experience developing presentations and communications to be shared with internal and external stakeholders
- Expert knowledge of data modeling and understanding of different data structures
- Experience using big data batch and streaming tools
- Brings a high-energy and passionate outlook to the job and can influence those around them; able to build a sense of trust and rapport that creates a comfortable and effective workplace
- Passion for innovation and a "can do" attitude
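As a sketch of the batch/real-time duality named in the responsibilities above, here is a minimal Spark Structured Streaming job, assuming a Kafka broker and a telemetry topic exist; all names are invented, and the spark-sql-kafka package must be on the classpath:

# Sketch: a running per-key event count over a (hypothetical) Kafka topic.
from pyspark.sql import SparkSession
from pyspark.sql.functions import count

spark = SparkSession.builder.appName("streaming-demo").getOrCreate()

# The same events could be processed in batch via spark.read.format("kafka").
stream = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "localhost:9092")
          .option("subscribe", "telemetry")
          .load())

# Maintain a running count of events per key, printed as micro-batches arrive.
counts = stream.groupBy("key").agg(count("*").alias("events"))
(counts.writeStream
       .outputMode("complete")
       .format("console")
       .start()
       .awaitTermination())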
Basic purpose: The Enterprise Data & Analytics group at the client is looking for a Big Data Developer to join a team that designs and develops big data solutions that meet business objectives. This is an exciting opportunity to work for a family-owned company that continues to experience growth, and to get in on the ground floor to help build the company's big data practice. The ideal candidate has deep technical knowledge of the Hadoop stack and possesses a desire to push the business further through innovation. This role requires a close partnership with the data science and analyst community as well as various IT teams to ensure requirements are met and solutions are supportable and scalable.

Major responsibilities:
· Design and implement data ingestion techniques for real-time and batch processes from a variety of sources into Hadoop ecosystems and HDFS clusters
· Visualize and report data findings creatively, in a variety of visual formats that provide insights to the organization
· Apply knowledge of standards, processes, and technology related to data, master data, and metadata
· Define and document architecture roadmaps and standards
· Drive use-case analysis and solution design around activities focused on determining how best to meet customer requirements within the tools of the ecosystem
· Ensure scalability, high availability, fault tolerance, and elasticity within the big data ecosystem
· Architect and develop ELT and ETL solutions focused on moving data from a highly diverse data landscape into a centralized data lake; also architect solutions to acquire semi-structured and unstructured data sources such as sensors, machine logs, click streams, etc. (a small sketch follows this posting)
· Manage all activities centered on obtaining data and loading it into an enterprise data lake
· Serve as an expert in efficient ETL, data quality, and data consolidation
· Stay current with vendor product roadmaps and make recommendations for adoption
· Maintain a customer-focused attitude

Education and requirements:
· Education:
o Bachelor's degree or equivalent in information technology, computer sciences, or computer engineering
· Experience:
o 8 years of IT experience
o 3+ years of experience building large-scale data solutions involving data architecture, data integration, business intelligence, and data analytics
o 1+ year of experience working on large-scale big data projects
o Deep technical knowledge of most components of the Hadoop ecosystem (MapReduce, HDFS, YARN, Hive, HBase, Sqoop, etc.), preferably with the Hortonworks distribution
o Experience building streaming analytics solutions using NiFi, Storm, or other similar technologies
o Understanding of statistical and predictive modeling concepts a plus
o Strong Java/J2EE experience
o Experience with visualization tools
o Experience with RDBMS platforms such as SQL Server and in-memory columnar storage such as HANA

Skills and physical demands:
· Skills:
o Ability to manage numerous competing demands in a fast-paced environment
o Excellent verbal and written communication skills
· Typical physical demands:
o Requires prolonged sitting, some bending and stooping
o Occasional lifting up to 25 pounds
o Manual dexterity sufficient to operate a computer keyboard and calculator
o Requires normal range of hearing and vision
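The semi-structured ingestion bullet above (sensors, machine logs, click streams) often starts with something this small: a hedged sketch that parses invented log lines into a columnar Parquet file with pyarrow:

# Sketch: machine-log lines to Parquet; the log format and file names are invented.
import re
import pyarrow as pa
import pyarrow.parquet as pq

pattern = re.compile(r"(?P<ts>\S+) (?P<machine>\S+) temp=(?P<temp>[\d.]+)")
records = []
with open("machine.log") as f:
    for line in f:
        m = pattern.match(line)
        if m:  # keep only lines that match the expected shape
            records.append({"ts": m["ts"], "machine": m["machine"],
                            "temp": float(m["temp"])})

# Write the structured records as a columnar file ready for the lake.
pq.write_table(pa.Table.from_pylist(records), "machine_logs.parquet")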
This position is part of the global payment and fraud team, the business team responsible for global payment and fraud strategy and execution at Starbucks. Millions of times a day, customers interact with Starbucks payment systems across physical and digital platforms, and customers expect to interact with those systems securely and seamlessly. Your role is to create tools and perform analytics that provide actionable insights enabling ongoing business optimization and innovation for Starbucks fraud and payment.

You'll be deeply embedded in the business, where the output of your efforts will immediately improve Starbucks. You'll help Starbucks figure out how we can make sure that we're always reliable for our customers in an ever more complex environment. You'll build powerful new self-service tools to make it faster and easier for your internal customers to act. Finally, you'll build tools that make it possible for us to find and fix problems quickly and efficiently.

We are a fast-paced environment using agile methodology and tailoring our designs and implementations to the maturity of the business. Currently we code in both Oracle PL/SQL and Microsoft BI platforms (SQL Server, Azure) and utilize Tableau as a visualization layer; this is always subject to change as new technologies emerge. Our supportive team culture encourages innovation, and we expect developers and management alike to take a high level of ownership of the product vision, technical architecture, and project delivery.

Responsibilities:
- Develop a solid understanding of all the transaction management systems Starbucks utilizes to manage our global business, including our digital stored-value card program and digital payments as well as our physical point of sale. This role will need to understand the end-to-end financial and workflow systems, processes, and back-end logging structure, and their impact on and relationship to the operational effectiveness and efficiency of the organization.
- Work with cross-functional teams of product owners, operators, data scientists, and technical source-system teams to understand data as well as business performance drivers, and architect data visualization tools to enable continuous improvement.
- Create reporting artifacts tailored to varying customers, from executive to operator.
- Raise the analytics bar by acting as a subject matter expert to business and technical teammates, and enable self-service analytics.
- Work through the entire software development lifecycle with little supervision.
- Take responsibility for calibration efforts and testing of tools to ensure the highest levels of accuracy and dependability, along with documentation and user training in support of self-service analytics.
- Work with teammates to evaluate reporting requests, assess feasibility, develop prototypes, and implement a production delivery plan in conjunction with Starbucks Technology as required.
- Utilize troubleshooting and advanced problem-solving skills to provide advanced support for systems developed by others.
- Utilize a variety of technologies, including SQL Server, Oracle, Azure, AWS, or others as needed.
- Facilitate changes into the production environment through appropriate teams via established change management processes.
- Participate in large-scale Starbucks international system implementations requiring fraud or payment capabilities and integration with our global data architecture supporting fraud and payments.
- Develop ad hoc BI solutions as needed to solve emerging and complex business problems.
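As one invented example of the payment analytics this role produces, here is a sketch of a daily decline-rate roll-up; sqlite3 stands in for the Oracle and SQL Server platforms the team actually uses, and the schema and data are made up:

# Sketch: daily decline-rate report; schema, data, and thresholds are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txns (day TEXT, status TEXT)")
conn.executemany("INSERT INTO txns VALUES (?, ?)",
                 [("2018-06-01", "approved"), ("2018-06-01", "declined"),
                  ("2018-06-02", "approved"), ("2018-06-02", "approved")])

# Share of declined transactions per day, the kind of metric a fraud dashboard tracks.
for day, rate in conn.execute("""
        SELECT day, AVG(CASE WHEN status = 'declined' THEN 1.0 ELSE 0 END)
        FROM txns GROUP BY day ORDER BY day"""):
    print(day, f"decline rate {rate:.0%}")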
Principal Big Data Developer, San Francisco, California, United States.

About Barrick: Barrick is the gold industry leader, with a vision of wealth generation through responsible mining: wealth for our owners, our people, and the countries and communities with which we partner. Our objective is to maintain and grow industry-leading margins, driven by innovation and our digital transformation; to manage our portfolio and allocate capital with discipline and rigor; and to leverage our distinctive partnership culture as a competitive advantage. We aim to cultivate a high-performance culture defined by the following principles: a deep commitment to partnership, consistent execution, operational excellence, disciplined capital allocation, and continual self-improvement. We are obsessed with talent, seek out fresh perspectives, and challenge ourselves to think differently as we transform Barrick into a leading 21st-century company.

Position description: Barrick is looking for a Principal Big Data Developer with a deep understanding of big data who will enable big data analytical solutions for Barrick's mines, working with and alongside the digital team and business users to continuously collect insight from maintenance data and enable Barrick to do "smart" maintenance.

Responsibilities:
- Ensure that data pipelines are scalable, repeatable, and secure, and can serve multiple users within the company
- Enable big data and batch/real-time analytical solutions that leverage emerging technologies
- Collect, parse, manage, analyze, and visualize large sets of data using multiple platforms
- Translate complex functional and technical requirements into detailed architecture, design, and high-performing software; translate business requirements into system requirements
- Code, test, and document new or modified data systems to create robust and scalable applications for data analytics
- Implement security and recovery tools and techniques as required
- Work with developers to make sure that all data solutions are consistent
- Ensure all automated processes preserve data by managing the alignment of data availability and integration processes
- Develop standards and processes for integration projects and initiatives

Qualifications:
- Minimum of 10-15 years of experience, with 5 years in data science
- Master's or PhD degree in information technology, computer science, or a related quantitative discipline
- Understanding of high-performance algorithms and the R statistical software
- Experience in industry data science (e.g., machine learning, predictive maintenance) preferred
- Capability to architect highly scalable distributed systems using different open-source tools
- Demonstrated ability to facilitate and lead others and to work with minimal direction, with a proven ability to coordinate complex activities
- Excellent problem-solving, critical-thinking, and communication skills
- Demonstrated experience with agile or other rapid development methods
- Demonstrated experience with object-oriented design, coding, and testing patterns, as well as experience in engineering software platforms and large-scale data
- Experience developing presentations and communications to be shared with internal and external stakeholders
- Expert knowledge of data modeling and understanding of different data structures
- Experience using big data batch and streaming tools
- Brings a high-energy and passionate outlook to the job and can influence those around them; able to build a sense of trust and rapport that creates a comfortable and effective workplace
- Passion for innovation and a "can do" attitude

What we can offer you:
- A comprehensive compensation package, including bonuses, benefits, and stock purchase plans where applicable
- The ability to make a difference and a lasting impact
- Work in a dynamic, collaborative, progressive, and high-performing team
- An opportunity to transform traditional mining into the future of digital mining
- Endless opportunities to grow and learn with industry colleagues
- Access to a variety of career opportunities across Barrick locations

Are you an experienced big data analytics engineer looking to join a global company's innovative data engineering team? Are you interested in learning more about AWS, Hadoop, or Spark? Do you like the flexibility and culture of a startup (work from home two days a week) with the infrastructure of a Fortune 500 company?
What's the job? As a data engineer on our data engineering team, you will work in collaboration with other data engineers and data scientists to help our company maintain its status as the world's largest and most diverse education provider. We are using all the newest big data and cloud-based technology to achieve our vision, and this is a great opportunity to learn these tools. You will connect and store our company's different sources of data in the appropriate warehouse and contribute to the continuous development and improvement of ETL processes and technologies. You will also profile and analyze source data to determine the best reporting structures to build, create source-2-target mappings (see the sketch after this posting), and design and develop ETL code to load and transform this source data.

What skills do we need?
- Experience with databases (relational and non-relational); we're open on this, so that could mean you know Oracle, SQL, or Redshift
- A proven track record of data-related projects developed in a commercial setting
- Strong ETL experience
- Knowledge of one programming language, preferably Python or Java

Compensation:
- $90,000 - $120,000
- 401(k) plan, pension
- Medical and dental benefits
- Education reimbursement

Who are we? We are a diverse company (20,000+ employees around the world) in the education field. We offer a number of different products related to helping students succeed, and over 1,000,000 students in 30+ countries work with us. You will work in our collaborative, fun environment located in Midtown East.

What's in it for you? This is a great opportunity for an experienced data engineer who wants to work for an innovative, worldwide, diverse company that is at the forefront of technology.
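Source-to-target mapping, mentioned above, can be as simple as a dictionary that drives the transform; here is a minimal sketch with invented column names:

# Sketch: dict-driven source-to-target mapping; columns and transforms are invented.
MAPPING = {  # source column -> (target column, transform)
    "stu_id":   ("student_id", str.strip),
    "enr_date": ("enrolled_on", lambda v: v[:10]),  # keep the date part only
    "crs":      ("course_code", str.upper),
}

def apply_mapping(source_row: dict) -> dict:
    # Produce a target-shaped row from a source-shaped row.
    return {target: fn(source_row[src]) for src, (target, fn) in MAPPING.items()}

print(apply_mapping({"stu_id": " 42 ", "enr_date": "2018-09-01T00:00:00", "crs": "math101"}))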
Mastech Digital provides digital and mainstream technology staff as well as digital transformation services for leading American corporations. We are currently seeking a Big Data Engineer for our client in the banking and financial services domain. We value our professionals, providing comprehensive benefits, exciting challenges, and the opportunity for growth. This is a permanent position, and the client is looking for someone to start immediately.

Duration: full-time
Location: Dallas / Tampa
Compensation: depends on experience

Role: Big Data Engineer
Primary skill: Hadoop
Role description: The Big Data Engineer would need to have at least 7 years of experience. As the Big Data Engineer, you will design and build data transformation pipelines on the company's cloud-based big data platform to enable efficient data analysis and machine learning models through proper data engineering in a distributed architecture. As the data engineer, you must be adept at data analysis and data processing, and must have mastered software application development and deployment.

Specific responsibilities:
- Learn the application programming interfaces (APIs) of various systems to acquire, ingest, and process the data
- Design interfaces to these applications using open-source tools such as Scala, Java, Python, Perl, and shell scripting
- Create data pipelines to maintain stable dataflow to the machine learning models, both in batch mode and near-real-time mode
- Support the chief data scientist, data scientists, and big data engineers in creating new and novel approaches to solving challenging problems using machine learning, big data, and cloud technologies
- Deploy the machine learning model and serve its outputs as RESTful API calls (a sketch follows this posting)
- Understand the business needs to come up with the relevant features in the data set useful for data science modeling
- Maintain the code and libraries in a code repository
- Work with the system administration team to resolve issues and install tools and libraries on the AWS platform
- Work closely with data scientists to do efficient feature engineering for machine learning models
- Mitigate risk by following established procedures and monitoring controls, spotting key errors, and demonstrating strong ethical behavior

Qualifications:
- Minimum of 7 years of related experience
- Understanding of AWS big data components and tools, and knowledge of the agile development cycle
- Experience working in a data-intensive role, including the extraction of data (DB, web, API, etc.), transformation, and loading (ETL)
- Experience with data cleansing and preparation on the Hadoop / Apache Spark ecosystem: MapReduce, Hive, HBase, Spark SQL
- Experience with distributed streaming tools like Apache Kafka
- Bachelor's degree preferred, with a master's degree or equivalent experience

Leadership competencies:
- Feedback: seeks feedback from others, provides feedback to others in support of their development, and is open and honest while dealing constructively with criticism
- Delegating: effectively manages tasks and people, taking a practical approach to determine the most effective method of execution while respecting others' expertise and considering others' feelings and working styles
- Inclusive leadership: values individuals and embraces diversity by integrating differences and promoting diversity and inclusion across teams and functions
- Coaching: understands and anticipates people's needs, skills, and abilities in order to coach, motivate, and empower them for success
- Team building: builds teams by quickly establishing relationships, and drives team identity and shared purpose based on diversity of thought, skills, and personalities

Education: bachelor's degree
Experience: minimum 7 years
Relocation: no, this position will not cover relocation expenses
Travel: N/A
Local preferred: yes
We are looking only for candidates willing to join us directly as W2 employees.
Recruiter name: Pratik Raj
Recruiter phone: 877-884-8834 (ext: 2313)
EOE
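For the "serve model outputs as RESTful API calls" responsibility above, here is a minimal Flask sketch; the route, port, and stub model are invented stand-ins for a real trained estimator:

# Sketch: serving a (stub) model prediction over REST with Flask.
from flask import Flask, jsonify, request

app = Flask(__name__)

def predict(features):
    # Stand-in for a real model: returns the mean of the inputs as a "score".
    return {"score": sum(features) / max(len(features), 1)}

@app.route("/predict", methods=["POST"])
def predict_endpoint():
    features = request.get_json().get("features", [])
    return jsonify(predict(features))

if __name__ == "__main__":
    app.run(port=8080)
    # Example call: curl -X POST localhost:8080/predict -H "Content-Type: application/json" -d '{"features": [1, 2, 3]}'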
Big Data Engineer (Oracle DBA, data modeling, C++, C#), Pittsburgh, PA. Our client in Pittsburgh, PA is currently hiring a Big Data Engineer with the following background and technical skills:
- Bachelor's or master's degree in computer science, informatics, or a related area
- 3-5 years of experience with large-scale data technologies such as big data and database technologies, plus data engineering knowledge of informatics, analytics, and computational science
- Knowledge of the toolsets and capabilities utilized by expert data scientists and modelers: C++, C#, SQL, MS SQL Server
- Software design
- Data modeling and ELT/ETL
Big Data Engineer, Oracle DBA, Data Modeling, C++, C# - 15993.

A rare opportunity to leverage your vision, energy, and acumen to shape business decisions and analytics with senior leaders in the fast-paced, growing wireless industry. Your experience, leadership, and ambition will help transform a data architecture and development team servicing the intelligence needs of a multi-billion-dollar strategic organization. You will be a key player in supporting coverage expansion, strategic planning projects, and enterprise procurement, among others, which are focused on T-Mobile's aggressive coverage and retail expansion. You'll do this by partnering with data scientists, analysts, data architects, and developers within our team and across the enterprise. You should be a super technical individual, passionate about data and solution oriented. A successful person in this role will be able to handle a mixture of project management, solution architecture, and some development within our department data warehouse, both on-premise and in the cloud.

Responsibilities:
• Support development efforts to build pipelines to data lakes, databases, and repositories across the enterprise and external to the company for our analytics organization
• Leverage your subject matter expertise in data architecture to continue to scale the analytics team's data and information needs
• Utilize best-in-class, standard-practice methodologies, systems, and implementations relating to data warehousing
• Develop custom scripts and code in support of unique data blending, aggregation, and enrichment
• Serve as a liaison, capturing user stories, use cases, requirements, and success measures from internal customers and translating them into development tasks
• Support analysts, data scientists, and leaders who conduct ad hoc and exploratory analytics
• Support architecture, deployment, and management of our SSAS cube for the continually growing self-service needs of our internal business customers
• Partner with multiple principal-level developers within our team to build and implement a data strategy in support of future analytics needs

Qualifications:
• 6+ years of work experience in program management in the data architecture and warehousing space
• Formal project management experience/certification is highly desired
• Experience mentoring other members of technical development staff
• Expertise in Microsoft SSAS development a huge plus
• 4+ years of experience with analytics and operational data warehousing deployments: SQL Server, SSAS, SSRS, Microsoft R Server, PostgreSQL, PostGIS, Oracle, Hadoop, HANA, and Teradata
• 1+ years of hands-on deployment and development experience within Amazon AWS and/or Microsoft Azure
• Experience with master data management (MDM)
• Expert in SQL, T-SQL, Visual Studio, SSIS, and at least one object-oriented programming language (Java, C#, C++, etc.)
• Experience with Jira software or equivalent
• Comfortable with scripting languages such as Bash and Python
• An effective communicator, confident and capable of interacting directly with VP- and SVP-level leaders
JobDiva # 18-12612
Job title: Big Data Developer (Java, Hadoop)
Location: Philadelphia, PA
Duration: 12+ months

The Product Analytics & Behavior Science team is responsible for nearly all of the product, web, app, and STB datasets used for analyzing the product experience. This means that we build datasets for the purpose of analysis and collaborate with modeling experts to build production data pipelines. Given the size and complexity of our datasets, this is not a trivial task. Nearly every product team member depends on this data, and as a member of this team you would be at the center of product innovation. This includes projects like:
- Building datasets in partnership with data scientists
- Creating source datasets for A/B testing (see the bucketing sketch after this posting)
- Creating data streaming applications for machine learning applications
- Maintaining prediction APIs
- Establishing new patterns for efficient processing of 100's-of-TB datasets
- Defining metrics for tracking how customers are interacting with products and services
- Partnering with UI engineering teams to enable new customer experiences
- Partnering with customer experience teams to create an improved experience and be the voice of the customer
- Leading data-driven product development

The perfect person will have a background in a quantitative technical field, rich experience working with large data sets, and some experience in data-driven decision making.

Skills and characteristics:
- You have at least 5 years of experience in a data-driven environment designing and building distributed data processing systems
- Hands-on experience developing big data pipelines end to end
- You are proficient programming in Python, Java, and/or Scala
- You strive to write beautiful code, and you're comfortable working in a variety of tech stacks
- You're self-motivated, ambitious, and quick to take action
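Creating source datasets for A/B testing usually relies on stable user-to-variant assignment; here is a common deterministic hash-bucketing sketch, with an invented experiment name and split:

# Sketch: deterministic hash bucketing for A/B assignment; names and split are invented.
import hashlib

def variant(user_id: str, experiment: str = "checkout_v2", treatment_pct: int = 50) -> str:
    # Hashing experiment:user gives each user a stable bucket per experiment.
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "treatment" if int(digest, 16) % 100 < treatment_pct else "control"

for uid in ["u1001", "u1002", "u1003"]:
    print(uid, variant(uid))  # same user always lands in the same variant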
Big Data Architect (contract to hire). Job location: Glen Allen, VA; St. Louis, MO; or Phoenix, Arizona; also open to remote.

Job description: Amitech is seeking a Big Data Architect to propel our company to the next level of growth. This position is the architecture leader for a business unit or an operational area within the company, and is a visible leader both internal and external to the client. This role will be included in evaluating potential mergers and acquisitions, and will help drive the client to become a global leader in supporting technology to deliver better outcomes to our members. Must be an expert in architecture methods and technologies across multiple platforms, with a shown ability to shape and drive strategic vision. Must excel at inspiring, motivating, and creating highly productive teams, and be expert at attracting, hiring, and developing a strong bench of talent. Possesses a unique combination of highly evolved technical skills and the ability to envision a strategy and be creative in order to realize ambitious goals on behalf of the business, inspiring the organization to create world-class solutions. Must be passionate about supporting the business and company vision, with the ability to communicate a clear point of view from a business perspective in strategic discussions with executive leadership.

Preferred skills:
- Over 10 years of engineering and/or software development experience and demonstrable architecture experience in a large organization
- Hands-on experience with big data components/frameworks such as Hadoop, Spark, Storm, HBase, HDFS, Pig, Hive, Scala, Kafka, Python scripts, and Unix shell scripts
- Experience in the architecture and implementation of large and highly complex big data projects
- Experience with Hadoop and related technologies (Cloudera, Hortonworks, etc.)
- Experience with data integration and streaming tools used with Hadoop (Spark, Kafka, etc.)
- Experience in metadata management, data lineage, and data governance, especially as related to big data
- Experience with cloud platforms (AWS, Azure), including readiness, provisioning, security, and governance
- Experience with or understanding of data science and related technologies (Python, R, SAS, etc.)
- Experience with or understanding of artificial intelligence (AI), machine learning (ML), and applied statistics
- History of working successfully with cross-functional engineering teams

Education: bachelor's degree in computer science or a related field.

About us: Amitech, a leading healthcare analytics and strategy consulting firm, leverages the true value in data to help healthcare systems and insurers lower costs, improve quality of care, and achieve better business outcomes. Reasons to partner with us: Amitech is a rapidly growing organization focused on our employees, and we're committed to offering opportunities to the best in the industry. Our diverse and innovative approach to everything we do means we're looking for the groundbreakers and the pioneers, people who think differently and create the future.

Lead Data Architect: This technologically savvy pharmaceutical company is urgently seeking an experienced data architect who will be responsible for the organization's business intelligence platform. The company currently has a small yet very close-knit team and has a vision to significantly scale the business through technological advances over the next year. This opportunity will provide an excellent foundation for career advancement.

Responsibilities: This person will serve as the BI expert and lead all initiatives around designing, building, testing, and deploying SQL reporting; help develop the vision for analytics through BI tools; work on ad hoc projects on an as-needed basis; and liaise with multiple areas of the business, including external partners.

Experience with the following is required: 5+ years of experience in a relevant position; SQL, SSAS, and Qlik. Experience within the pharmaceutical arena is a plus, as is database management experience, data modeling, ETL, and data processing.

Data Architect. Job location: Marina del Rey, CA. Contract to hire.

Role summary: Our client is seeking a data architect who is a visionary in defining and managing data architecture, with expertise in data modeling, data marting, ETL, performance tuning, data governance, and data security, leveraging big data technologies and columnar and time-series data stores along with traditional RDBMS. The right candidate will be a self-motivated, results-driven technologist who is passionate about collaboration but can work independently and lead by example.

Responsibilities:
- Define system-level architecture and conduct dimensional modeling and data marting
- Mentor team members through conceptual and logical modeling, and drive physical modeling on the data marts
- Define data security protocols and enable access controls
- Conduct database performance tuning and architect low-latency data systems
- Bring extensive experience building a master data management strategy in an organization
- Build highly scalable data marts that can be used by DSC globally, and take responsibility for maintaining data integrity across multiple data marts
- Build the overall data mart architecture and design, and document the data systems ecosystem
- Map data from sources to the data marts, and work with peer data engineering teams to pipeline the data
- Design and code highly scalable solutions for data extractions from the data lake and transformation jobs for business rule application
- Define and parallel-process the ETL jobs for low-latency, highly scalable systems
- Architect, design in detail, and code data quality frameworks that can measure and maintain data completeness, data integrity, and data validity between interfacing systems (a sketch follows this posting)
- Document data mappings and maintain a data dictionary across all DSC enterprise data
- Own the KPIs to measure the performance of the data marts and provide visibility to senior management
- Design for self-serve BI platforms and drive higher adoption rates

Qualifications:
- Master's degree in computer science, data science, or a related major
- Minimum 10 years of industry experience overall
- 10+ years of data warehousing and data architecture, with 8+ years of data modeling and data processing for large-scale, near-real-time big data platforms like Redshift, HBase, Druid, and Snowflake
- 8+ years of architecting end-to-end self-serve BI platforms using BI tools like Tableau, Qlik Sense, Looker, or the like preferred
- 8+ years of ETL knowledge, plus parallel-processing technologies like Spark and Kafka Streaming
- 5+ years of programming experience with Java, Python, or C/C++ in a Linux/Unix environment
- Minimum 2 years of working knowledge of cloud-based solutions hosted on AWS
- Confluence, GitHub, and Jira are other preferred tools and technologies
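A data quality framework of the kind described above boils down to rules evaluated per row; here is a minimal sketch with invented rules and records:

# Sketch: row-level data-quality audit; fields, rules, and rows are invented.
RULES = {
    "order_id": lambda v: bool(v),                   # completeness: must be present
    "amount":   lambda v: v is not None and v >= 0,  # validity: non-negative amount
}

def audit(rows):
    # Return (row index, field) pairs for every failed check.
    failures = []
    for i, row in enumerate(rows):
        for field, ok in RULES.items():
            if not ok(row.get(field)):
                failures.append((i, field))
    return failures

print(audit([{"order_id": "A1", "amount": 19.5}, {"order_id": "", "amount": -3}]))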
Applicants must be authorized to work in the U.S. Please apply directly by clicking 'Click here to apply' with your Word resume! We look forward to receiving your resume and going over the position in more detail with you. The Platinum team! Platinum is proud to be an equal opportunity employer! All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability, protected veteran status, or any other characteristic protected by law. Your right to work: in compliance with federal law, all persons hired will be required to verify identity and eligibility to work in the United States and to complete the required employment eligibility verification document form upon hire. Copyright © 2006-2018 Platinum ESOL Inc. All rights reserved.

Does data speak to you? Can you understand business processes and identify business needs by looking at patterns in data? Does talking data with people excite you? If the answer is yes, then we want to talk to you!

Sevatec is the preeminent mid-tier, agile-focused company serving the national security sector. We are seeking a Data Sciences Solutions Architect within our Chief Technology Officer's innovations team to serve as the corporate thought leader across our data science and business intelligence technology area. Sevatec's Data Science Solutions Architect will play an integral role in executing our strategic plan in all areas of data and analytics, across organic growth, operational delivery, and technology advancement. With organic growth, you will have a leadership role working with business development and operations in developing technology roadmaps, intellectual property investment strategies, and complex technical responses to customer requirements, resulting in innovative and winning solutions for Sevatec's federal clients. In operations, you will directly support project start-ups and critical executions, and implement innovative solutions to advance our customers' missions while on contract. The solutions architect mentors key staff and helps transfer knowledge of new technologies across the company's portfolio of programs. Technology advancement consists of institutionalizing that program knowledge within the firm, developing discriminating intellectual property, maintaining communities of data science and business intelligence subject matter experts (SMEs), creating training materials to ensure consistent proficiency throughout the company, and projecting Sevatec's thought leadership within the technology community through speaking engagements and social media.

Responsibilities:
- Coordinate with capture and proposal managers to create proposal strategies for developing discriminating win themes that address customers' business challenges throughout the business development lifecycle, including sources sought, requests for information, and proposal responses
- Identify and manage SMEs, technical writers, and consultants to address the requirements, instructions, and evaluation criteria set forth by the solicitation and by Sevatec's requirements-driven outline
- Research, create, and review information in support of management approaches, technical approaches, and past performance volumes
- Own the complex data analytics technical solutions, integrated master schedules, and work breakdown structures during the proposal lifecycle, and serve as the lead technical author in creating winning solutions, inclusive of color team reviews
- Interface with illustrators to conceptualize graphics, figures, tables, and other presentation techniques to optimize the presentation of information
- Lead the development of proposal demos, briefings, and oral presentations that illustrate Sevatec's technical depth and innovations in the areas of data and data sciences
- Support the overarching pricing strategy, the formation of staffing plans, and the development of the bases of estimate that align with the technical and management approaches
- Model and frame business scenarios, as a member of the delivery team on current programs, that are meaningful and impact critical business processes and/or decisions
- Identify business metrics to baseline program performance and identify improvement results
- Identify the available and relevant data, including internal and external data sources, leveraging new data collection processes as appropriate
- Design innovative and effective data architectures to solve existing customers' analytics problems; communicate the methodologies used and the results achieved in key meetings with customers
- Research, design, and advocate new technologies, architectures, and data analytics products for federal customers; create prototypes, proofs of concept, and training materials
- Maintain communities of data science and business intelligence SMEs, and project thought leadership within the technology community through speaking engagements and social media

Requirements:
- Bachelor's degree in engineering, computer science, or similar preferred
- Must have 10+ years of experience, with 5+ years of recent experience as a solution architect and technical proposal leader on multiple opportunities ranging from $10M to $250M
- More than 5 years of relevant quantitative and qualitative research and analytics experience, with proficiency in statistical analysis, quantitative analytics, forecasting, predictive analytics, multivariate testing, and optimization algorithms
- Demonstrated ability to grasp new concepts, develop solutions from loosely defined business problems by leveraging pattern detection over large datasets, and write about complex subject matter
- Proven data analysis and SQL skills; extensive experience with data warehousing, data mining, and visualization
- Strong programming skills in open source (e.g., Hadoop, MapReduce, or other big data frameworks; Java), commercial software products (e.g., OBIEE, Tableau), and statistical modeling (e.g., SAS or R)
- Must have exceptional written and verbal communication skills
- Ability both to work independently and to function effectively as a member of a team with a diverse range of skills, experience, and personalities
- Excellent time management and organization skills, handling multiple simultaneous projects with stringent deadlines
Job description requirements:
· Very strong engineering skills
· Should have an analytical approach and good programming skills
· Provide business insights while leveraging internal tools and systems, databases, and industry data
· Minimum 5+ years' experience; experience in the retail business will be a plus
· Excellent written and verbal communication skills for varied audiences on engineering subject matter
· Ability to document requirements, data lineage, and subject matter in both business and technical terminology
· Guide, and learn from, other team members
· Demonstrated ability to transform business requirements into code: specific analytical reports and tools
· The work will involve coding, analytical modeling, root cause analysis, investigation, debugging, testing, and collaboration with business partners, product managers, and other engineering teams
· Must have a strong analytical background; a self-starter
· Must be able to reach out to others and thrive in a fast-paced environment
· Strong background in transforming big data into business insights

Technical requirements:
· Knowledge of and experience with Teradata physical design and implementation, and Teradata SQL performance optimization
· Experience with Teradata tools and utilities (FastLoad, MultiLoad, BTEQ, FastExport)
· Advanced SQL (preferably Teradata)
· Experience working with large data sets and with distributed computing (MapReduce, Hadoop, Hive, Pig, Apache Spark, etc.)
· Strong Hadoop scripting skills for processing petabytes of data
· Experience in Unix/Linux shell scripting or similar programming/scripting knowledge
· Experience with ETL processes
· Real-time data ingestion (Kafka); a consumer sketch follows this posting

Nice to haves:
· Development experience with Java, Scala, Flume, Python
· Cassandra
· Automic scheduler
· R / RStudio, SAS experience a plus
· Presto
· HBase
· Tableau or a similar reporting/dashboarding tool
· Modeling and data science background
· Retail industry background
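For the real-time Kafka ingestion requirement above, here is a minimal consumer sketch using the kafka-python package; the topic, broker address, and message fields are invented:

# Sketch: near-real-time ingestion from a (hypothetical) Kafka topic.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "pos_events",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
for message in consumer:
    event = message.value
    # Hand off each event to a downstream transform/load step.
    print(event.get("store_id"), event.get("sku"))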
environment
· strong background in transforming big data into business insights
technical requirements:
· knowledge experience on teradata physical design and implementation teradata sql performance optimization
· experience with teradata tools and utilities (fastload multiload bteq fastexport)
· advanced sql (preferably teradata)
· experience working with large data sets experience working with distributed computing (mapreduce hadoop hive pig apache spark etc)
· strong hadoop scripting skills to process petabytes of data
· experience in unix linux shell scripting or similar programming scripting knowledge
· experience in etl processes
· real time data ingestion (kafka)
nice to haves:
· development experience with java scala flume python
· cassandra
· automic scheduler
· r r studio sas experience a plus
· presto
· hbase
· tableau or similar reporting dashboarding tool
· modeling and data science background
· retail industry background

business intelligence – data warehouse architect (consultant)
tracking code: 303502-616
job description
position summary: the business intelligence data warehouse architect will be responsible for the planning design management and support of company-wide business intelligence (bi) data warehouse (dw) initiatives these responsibilities include: business intelligence tool system implementation planning and execution of projects and enhancements (lifecycle) and the management of day-to-day production system operations the function of the bi dw architect is to understand needs drive requirements propose solutions and manage the delivery of multiple cross-functional data reporting and analytics projects and integrations between different systems additionally the architect will be required to present a strong understanding of bi dw architecture concepts possess hands-on experience with bi tools and technology and be able to serve as the primary subject matter expert and guide team members
summary of duties and responsibilities:
general
serve as the technical expert responsible for the architecture design and implementation of bi dw solutions with complete and accurate information; information data delivery to business users using maintainable systematic and automated procedures make recommendations and guidance about data collection methods master data management concepts metrics data definitions and data evaluation methods in collaboration with internal and external partners evaluate create and deliver on data needs for assigned projects ensuring data integrity help establish and define data validation data cleansing data integration and transformation practices facilitate data reporting analytics design review sessions testing cycles validation and other post-deployment activities lead exercises around data architecture standards and documentation map configurations and complex data architectures ensuring documentation meets current and forecasted needs
business relationship management
establish close partnerships within the it department and with the business teams to provide data strategy expertise ensuring overall data quality within all enterprise systems including interfaces between applications systems work with the business teams to gather requirements for data and reporting enhancements change requests and new functionality stay abreast of contemporary technology trends concepts and serve as an sme for the business teams coach staff on bi dw best practices
system maintenance and support
help to facilitate unit system integration and user acceptance test plan development and execution
including the creation of test scenarios scripts test data setup and verification validation of test results manage issues and bugs within the system using tracking support systems; liaise with internal and external resources to facilitate resolution and closure serve as the main point of escalation for issues requiring external vendor support
vendor management
work with vendor to validate and refine technical specifications against business requirements communicate with external vendors and other departments to resolve production issues upgrades and software evaluations prepare for and participate in vendor relationship activities and stewardship meetings on an ongoing basis performs other related duties and assignments as required
required experience
education: bachelor’s degree in information technology mis computer science data science or related work experience
minimum qualifications: minimum 8 years of proven experience with business intelligence and data warehousing with increasing responsibilities experience serving in fashion apparel retail or related environment preferred minimum 6+ years of practical hands-on experience with two or more of the following areas of bi dw: reporting etl data analysis data modeling architecture development architecture information data delivery bi dw quality assurance master data management bi dw solution architecture or bi dw infrastructure proven experience with database normalization data validation data hygiene data cleansing data consolidation data enrichment and daas strong and adaptive interpersonal skills with the ability to operate effectively with multiple distinct business groups and levels within the organization excellent problem-solving skills with the ability to make effective and timely decisions experience designing implementing growing and supporting business intelligence and data warehouse systems and competencies experience leading projects with 10+ resources including a mix of internal and external a minimum of 4 years managing business intelligence and data warehouse projects proven lifecycle solution development from requirements gathering through build test deploy and support ability to write clear specifications and requirements must have excellent communication skills both written and verbal with a strong inclination to follow-up communicate through ambiguity and function with a service orientation detail and task oriented with a sense of urgency and the ability to reprioritize as needed with minimal direction ability to multitask in a collaborative environment partnering with individuals from various departments strong analytical skills with demonstrated ability to analyze and make recommendations on solutions
skills required: proven project management skills in the area of data integration etl reporting data warehousing and business intelligence projects experience working with teams comprised of consultants vendors and full-time employees exceptional attention to detail exceptional oral and written communication skills including working with business leaders and presenting to it leadership and company executives strong analytic skills pertinent to business intelligence and data warehousing experience working with agile waterfall (or related) project methodologies and the ability to determine when each is best used formal training in itil project management business analysis and or process modeling preferred

locations: remote
duration: 12 months
responsibilities: our client needs a strong sr big data engineer architect to support an
ongoing project this position is part time probably totaling around 1000 hours or so in 2018 some weeks will be heavier than others this position is 100% remote targeting someone with a rhcse and a fundamental big data background the most important skills are listed below:
a solid understanding of linux is essential rhcse preferred
supporting cloudera data science workbench requires hadoop administration experience with secure clusters along with an understanding of docker
supporting the dgx-1 gpu systems requires an understanding of gpu technologies and software tools
ideally the candidate can work well with data science teams to identify system environment changes needed for evolving workloads
regards megha ananthakrishna technical resource manager | matchpoint solutions | office 925-829-7755 | email megha@matchps com

title: sr big data engineer (hadoop)
location: cincinnati oh
conexess group is a staffing company that specializes in finding the right talent for our clients and connecting people with new opportunities our client is expanding its big data analytics group and is looking for high-level talent to join them in this high-priority effort the big data engineer will work with other highly talented data engineers data scientists and bi professionals and deal with some of the most advanced data solutions to engineer new analytics tools that deliver never-before-seen industry insights
job summary
the big data engineer performs advanced data modeling and optimization of data and analytics solutions at scale they are a subject matter expert in data management and data access (big data and traditional data marts) advanced in programming advanced in database modeling and familiar with analytic algorithms and applications
additional responsibilities
the big data engineer will leverage data types and various data models to enable a range of analytic solutions and will leverage technologies for managing and manipulating data scaling data models and solutions to support analytics that drive business insights across the organization
requirements
3-5 years of data engineering experience utilizing the hadoop platform
advanced coding experience skills with the following: linux shell scripting python or r language or java apache spark
demonstrated mastery in applied big data technologies and tools such as: oozie airflow kafka knime hive pig impala
experience with nosql databases
proficiency with data modeling and solution scaling
aws and or azure framework (preferred)
working knowledge of packaging build tools such as sbt apache maven or ant (preferred)
3-5 years of relevant experience working for large organizations (highly preferred)
understanding of data mining data modeling and data provisioning (acquisition transformation and sharing)
strong written and verbal communication skills leadership skills and self-starter mentality
ability to handle multiple priorities and work collaboratively across functions
job perks
excellent compensation
excellent full-time benefits
ideal opportunity for someone who enjoys taking on and being exposed to new challenges
opportunity to work for a prominent global organization on some of the most advanced data solutions industry-wide
organization will pay relocation if needed

our client a focused and highly accomplished group of professionals is seeking multiple top secret cleared data engineering professionals across multiple data management disciplines this is a ground floor opportunity to join our client on a contract at its inception with a base period of one year four one-year option periods
through 2022 and a total ceiling of $90 million if all options are exercised these positions are well paid offer exceptional benefits and genuine family friendly leadership and can be positioned in multiple dc metro locations; joint base anacostia bolling (jbab) washington dc college park md reston va or charlottesville va
technically we are looking for data engineers & analysts with experience in
- data management
- data governance
- data architecture
- data security
- metadata management
- digital data management
- data library science
a partial list of data engineering projects involved includes: data source analysis data modeling data life-cycle strategy data mining data warehouse data structure data policy data exploitation data analytics big data artificial intelligence (ai) algorithms machine learning (ml) deep learning (dl) neural networks natural language processing (nlp) statistics semantic web ontology taxonomy data strategic planning anomaly detection - plus many many more
required skills
clearance required:
- must be a u s citizen and possess a current and active ts sci clearance granted by the department of defense or an intelligence community
- alternatively can accept ts sci eligible candidates if the investigation was conducted in the last 6 years from dia dod or intel communities only must be able to pass a counterintelligence (ci) polygraph the ci poly will be conducted within a few months after starting employment
education: bachelor’s or master’s degree (preferred) in computer data science or mathematics
experience: five (5) years of working with data on military or intelligence projects
experience and a thorough understanding of as many of the technologies and frameworks noted as possible: master data management reference data management metadata management digital data management data library science data model data modeling data quality data steward collibra data governance center data management data management association (dama) dmbok data strategy data strategic planning entity extraction graph database python r java xml xslt scala scikit-learn tensorflow apache mxnet theano keras spark caffe open nlp pandas random forest logistic regression arima knn algorithm computer vision pattern recognition classification models microsoft cognitive toolkit (cntk) govcloud commercial cloud computing (c2s) azure amazon web services (aws)
are you interested in learning more? do you have a top secret sci clearance or sci eligibility? are you a data professional with a background working with the military or the intel community? if so contact elite technical for immediate consideration thank you in advance
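
for illustration only: a minimal sketch of the classification modeling named in the skills list above (scikit-learn random forest logistic regression); the data here is synthetic and every name is invented not drawn from any actual project

    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    # synthetic stand-in for a labeled analytic data set
    X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    # fit the two model families the skills list names and compare auc
    for model in (LogisticRegression(max_iter=1000),
                  RandomForestClassifier(n_estimators=200, random_state=0)):
        model.fit(X_train, y_train)
        auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
        print(type(model).__name__, round(auc, 3))
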
we are seeking a highly motivated individual with hands on technical skills in harnessing data into insights; leadership skills to manage and lead a technical data team; and experience in building business intelligence data systems in a b2b setup the manager data intelligence is responsible for developing executing and managing the it systems that support cr’s data intelligence platform partners with the it leadership team to help set technology strategic objectives assess select and oversee implementation of necessary systems as well as manage external partners manages and leads a team in building business intelligence data systems works with multiple cross functional teams (digital product teams it content devops etc) to analyze the requirements translate them into detailed technical specifications and architect build the systems including databases data pipelines etls and analytical reports & dashboards using visualization tools
bachelor’s or master’s degree in computer science or other related field and a minimum of 7 years of experience leading data technology programs experience at a market research business intelligence or b2b publishing organization is a plus experience architecting and building data visualization reporting and business intelligence systems advanced knowledge of data architecture and engineering expert level knowledge in data modeling mining visualization scaling and performance tuning of high volume oltp olap and data warehouse environments
strong knowledge and hands-on experience in one or more programming languages and frameworks (java jee (preferred) spring python flask)
strong knowledge and hands-on experience in developing web services (rest apis)
strong knowledge and hands-on experience working with one or more databases (oracle mysql mongodb redshift)
strong knowledge and hands-on experience with one or more data visualization tools (tableau (preferred) qlikview plotly d3 etc)
hands-on experience working in cloud infrastructure (aws)
strong organizational administrative communication and collaboration skills (team leading managing capabilities)
beneficial but not required are exposure to big data ecosystems (hadoop spark etc) understanding of data science and machine learning and experience with front-end technologies (html css javascript angularjs react etc)
experience working with agile lean product development approaches as well as cooperatively working in a devops culture experience working with modern tools in the agile software development life cycle - version control systems (ex git github stash bitbucket) knowledge management (ex confluence google docs) development workflow (ex jira) continuous integration (ex bamboo) real time collaboration (ex hipchat slack)
superior written and verbal communication skills prior experience managing staff and leading cross-functional teams proven experience in managing agencies and 3rd party vendors
assesses recommends and manages it systems supporting the data intelligence platform from discovery through deployment along with the it leadership serves as a driver of the technology needed to deliver and continuously evolve cr’s data intelligence program partners with cross functional teams at cr to understand business requirements define deliverables and recommend system improvements to deliver best-in-class analytics reporting and data visualization collaborates and coordinates development qa deployment and support activities involving the data intelligence platform
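
as a hedged illustration of the python flask rest api skill this posting asks for a minimal endpoint might look like the following; the route name and metrics are invented stand-ins not cr’s actual api

    from flask import Flask, jsonify

    app = Flask(__name__)

    # static stand-in; a real data intelligence service would query the warehouse
    METRICS = {"weekly_active_users": 1842, "reports_delivered": 97}

    @app.route("/api/v1/metrics/<name>")
    def get_metric(name):
        """return one named metric as json or a 404 for unknown names"""
        if name not in METRICS:
            return jsonify(error="unknown metric"), 404
        return jsonify(metric=name, value=METRICS[name])

    if __name__ == "__main__":
        app.run(port=5000)
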
creates and presents proof of concepts (pocs) to leadership as per data intelligence product team and overall it data architecture needs responsible for hiring managing developing and evaluating the team’s technical skill set; coaches and mentors more junior staff performs active hands-on development coding activities as well as leading engineers with technical guidance maintains knowledge of and evaluates emerging technology to educate it and other divisions on opportunities identifies and evaluates potential data vendors

the data services team at hearst part of corporate technology is responsible for a number of big data sources that power products used throughout the organization the database administrator will be a key member of the team having responsibility for the design and maintenance of key pieces of the hearst data warehouse we ingest many terabytes of data each day from a wide variety of internal and external sources; virtually all of that data is stored in databases at some point the dba will take ownership of a number of amazon redshift clusters our google bigquery implementation as well as other smaller databases on a variety of platforms and will ensure their integrity and resilience
skills & requirements
3+ years experience with amazon redshift; experience with mysql is a plus experience with google bigquery is a big plus
aws management and usage is required must include significant experience with database design and optimization including writing and executing ddl statements
understanding of database performance and tuning must have a solid understanding of redshift distribution and sort keys and column encodings
strong working knowledge of aws – must have experience spinning up and resizing redshift clusters be familiar with rds and know how to install a database on an ec2 instance
demonstrable experience with issue detection and resolution; be ready to provide examples
working knowledge of backup and recovery procedures
ability to provide guidance to data scientists engineers and other team members
documentation skills for processes and procedures
experience with nosql databases such as cassandra dynamodb and mongodb is a big plus
working knowledge of linux
comfort with python bash scripting and crons

job summary
vydia’s data science efforts are an integral part of the company’s success through our data we offer insights to our artists creators partners and internal users our elt workflows simultaneously promote quick analysis and richer complex investigations our data warehouse supports both data science and business intelligence one-third of our data is well-structured with the remaining being mostly semi-structured and some fully unstructured we have terabytes of data currently doubling every 5.3 months
responsibilities and duties
as a data engineer you will own vydia’s multitude of data pipelines you will design and implement our elt workflows which originate at partner apis and conclude in our data warehouse you will work closely with our data science business intelligence and product teams in determining current and future needs as part of the data engineering team your responsibilities will include:
ensuring the availability and timely delivery of data company-wide
modeling new data sets
design of all new elt workflows and pipelines
own the orchestration of the workflows and contribute strongly to infrastructure decisions
monitoring and improving on existing pipelines and oversight of our elt workflow
maintaining a single version of truth for our data
working with others to implement continuous integration (ci) data quality tests
mentoring and guiding your junior colleagues
being a thought leader with respect to the company's data strategy
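
a minimal sketch of one elt step of the kind described above (partner api to s3 to warehouse); the url bucket and table names are invented and the redshift copy is shown only as a comment

    import json

    import boto3
    import requests

    PARTNER_URL = "https://api.example-partner.com/v1/plays"  # invented
    BUCKET = "example-data-lake-raw"                          # invented

    def extract_to_s3(date):
        """pull one day of partner json and land it unmodified in s3"""
        records = requests.get(PARTNER_URL, params={"date": date}, timeout=30).json()
        key = "partner_plays/dt={}/plays.json".format(date)
        boto3.client("s3").put_object(
            Bucket=BUCKET,
            Key=key,
            Body="\n".join(json.dumps(r) for r in records),
        )
        return key

    # loading is then a single warehouse copy, e.g. in redshift:
    # COPY staging.partner_plays
    # FROM 's3://example-data-lake-raw/partner_plays/dt=2018-01-01/'
    # IAM_ROLE '<role arn>' FORMAT AS JSON 'auto';
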
technologies: we use aws s3 & ec2 extensively and we will be implementing airflow our current dw is on redshift our app relies mostly on postgres we use looker in-house for bi product engineers work mostly in ruby and python our data scientists work in r we are not married to any tool or technology and would expect that our ideal candidate will not be either
qualifications and skills
about you: we want to learn more about you if you are the ideal candidate for this role let us know in our mind being a perfect fit means that you have the necessary hard skills and expertise and the complementary soft skills you are a python pro you have several years’ experience wrangling data you love apis when you encounter a new one you study it inside and out you learn every corner of it as though you designed it yourself you intuitively know how to extract value and insights from data working with deeply-nested complex json is a fun day at the office for you you can articulate the merits and pitfalls of the different approaches in designing a pipeline you are passionate about data quality control and know how and where to anticipate potential errors working “in the cloud” is not a point of distinction for you it is a given you understand what it means to work at a tech startup hopefully this is what excites you more than anything else about working here you love the idea of building the data scene in nj and being a leader in this community you have orchestrated workflows using airflow and are familiar with the challenges and how to overcome them critically you are a person who thinks in data you relate the real world to data and your data to the real world you understand that data is not the end goal but a vehicle to help get us where we are going and you see your role as the person most critical in making that happen
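
a minimal sketch assuming apache airflow (which the posting says is being adopted) of how the two elt steps above could be orchestrated; dag id task ids and helpers are illustrative and import paths differ across airflow versions

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract_to_s3(**context):
        ...  # land raw partner json in s3 as in the sketch above

    def copy_to_redshift(**context):
        ...  # issue the warehouse copy against the landed files

    with DAG(
        dag_id="partner_plays_elt",        # invented
        start_date=datetime(2018, 1, 1),
        schedule_interval="@daily",
        catchup=False,
    ) as dag:
        extract = PythonOperator(task_id="extract_to_s3",
                                 python_callable=extract_to_s3)
        load = PythonOperator(task_id="copy_to_redshift",
                              python_callable=copy_to_redshift)
        # the load task only runs once the extract task succeeds
        extract >> load
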
director of data
responsibility and ownership for consolidated business intelligence data ecosystem data types include audience advertising financials and hr the role is responsible for the strategy solution design team leadership and roadmap management of 30+ data sources you will build and scale a foundation of good data to service the needs of internal external reporting advanced analytics data science machine learning and production modeling
responsibilities:
understand data needs across all lines of business and functions establishing technical methods to leverage data across basic to advanced decisions and data flows
assess and launch machine learning and data science infrastructure paving the way to inform the production of purposeful content improve our marketing return and increase audience satisfaction with our products
own and commit to decisions on how we design build and launch the data platform to scale for future initiatives
strategize and maintain a conceptual big picture which guides the logical and physical data models that are capable of supporting existing and future data needs
perform data profiling of legacy and strategic data source systems to assess data quality and identify issues inconsistencies opportunities missing data
recommend opportunities to decrease the data ecosystem cost and increase its value
develop protocols and governance that promote data standards and data quality across the business
ensure workflows and solutions are designed and enforced to maintain compliance data privacy legal compliance (gdpr internal external data security) and data pipeline uptime (sla solutions alerting documentation)
requirements:
experience with near data dwh and data lake architecture including full stack technical designs in aws azure
prior management of multiple project streams that balance infrastructure maintenance business requirements and exploratory tasks with incremental value
6-8 years experience in dwh design management strategy
3 years+ in a team leadership role
excited to enter an early-stage organization within a fast growing and evolving company

data science big data and automation are helping reinvent the healthcare system in an effort to help people live healthier lives and help make the healthcare system work better for everyone the advanced research and analytics (ara) organization of unitedhealthcare (uhc) delivers analytic services and solutions which support all aspects of uhc operations from intelligent claims adjudication identification of fraud waste abuse improved member provider experience clinical intelligence and more we deliver machine learning artificial intelligence and big data to turn healthcare data into information that improves the lives of our members providers and the healthcare system as a whole the senior director of technology - data science & big data is responsible for defining and leading a technology strategy that advances machine learning automation and big data capabilities across unitedhealthcare the position will lead a team which includes a director of data & security and a director of application development & support together the technology team is responsible for the big data environment big data development analytic application platform application development & integration and solution support ara drives analytic technology innovation analytic r&d analytic solution development and pilots ara then works with it to transition solutions to enterprise production and support as such this position will manage a tight alignment with uhc it (serviced by optum technology) and the uhc cios for the various lines of business within unitedhealthcare daily responsibilities include continued innovation and development of our analytics automation and big data platforms development and execution of our data strategy managing internal engagements meeting project and deliverable timelines ensuring the accurate and timely completion of deliverables additional responsibilities involve working with analytics directors on staffing needs strategic technology trend identification monitoring project progress and completion issue resolution and technical solutions related to data data flow and advising on technology needs a strong candidate would have
experience leading teams who apply advanced analytics technology (supporting predictive analytics machine learning and artificial intelligence) big data technology big data modeling and development and automation technology the successful candidate will have experience in application development analytical infrastructure development and business solution consulting experience in the medical or pharmacy policy space fraud-waste-abuse detection marketing analytics and healthcare economics is also a plus you'll enjoy the flexibility to telecommute* from anywhere within the u s as you take on some tough challenges
primary responsibilities:
management of the infrastructure that supports ara (provisioned via platform as a service from optum technology)
define and execute our big data strategy including design and development of the data environment required to support analytics for both batch and real-time analytics
define and execute processes to help ensure and enforce the security of our data while enabling agile development and access of the data by business users power users and developers
define and develop an analytic platform which provides the software needed to explore discover and develop production analytic solutions which integrate with business applications
define and foster best practices in software design and code management
define and execute a tiered support model for analytic solutions working with business partners and it
support analytic technology innovation and promote a culture of patents papers and internal collaboration
work with ara portfolio directors to develop a data onboarding roadmap as well as data archiving
monitor and manage the efficient operations of our data users and solutions
required qualifications:
undergraduate degree
7+ years of experience in the development of enterprise solutions in the areas of advanced analytics (predictive analytics machine learning artificial intelligence)
7+ years of big data (technology and data strategy)
7+ years of experience leading technology teams
experience with analytical platform data management platform integration
experience with open source analytic platforms (r python scala etc) and commercial platforms (sas spss azure etc)
experience delivering analytic solutions on the hadoop stack (mapreduce sqoop pig hive hbase flume)
experience integrating analytic models within business applications for real-time scoring
knowledge of nosql platforms (e g mongodb couchbase marklogic etc)
demonstrated experience consulting with executive management on technical solutions
demonstrated experience setting and executing a technology strategy
ability to navigate and drive results in a matrixed environment
ability to document and communicate complex technical concepts
effective in a fast-paced team-oriented environment
preferred qualifications:
master’s degree in computer science or related field
demonstrated experience with infrastructure planning scaling and administration including demonstrated experience with big data platforms
experience with healthcare data specifically in medicare and medicaid analytics provider and consumer analytics and fraud waste and abuse (fwa) analytics
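
a minimal sketch of the real-time scoring integration listed in the qualifications above; the model file and feature names are invented stand-ins not uhc’s actual solution

    import joblib

    # invented stand-ins: a pre-trained classifier serialized to disk and
    # the feature contract it was trained on
    MODEL = joblib.load("fwa_model.pkl")
    FEATURES = ["claim_amount", "provider_visits_90d", "member_age"]

    def score_claim(claim):
        """return a fraud-waste-abuse risk score for one claim record"""
        row = [[claim[f] for f in FEATURES]]
        return float(MODEL.predict_proba(row)[0][1])

    # e.g. score_claim({"claim_amount": 1250.0,
    #                   "provider_visits_90d": 14,
    #                   "member_age": 61})
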
careers with unitedhealthcare let's talk about opportunity start with a fortune 6 organization that's serving more than 85 million people already and building the industry's singular reputation for bold ideas and impeccable execution now add your energy your passion for excellence your near-obsession with driving change for the better get the picture? unitedhealthcare is serving employers and individuals states and communities military families and veterans wherever they're found across the globe we bring them the resources of an industry leader and a commitment to improve their lives that's second to none this is no small opportunity it's where you can do your life's best work (sm)
*all telecommuters will be required to adhere to unitedhealth group's telecommuter policy
diversity creates a healthier atmosphere: unitedhealth group is an equal employment opportunity affirmative action employer and all qualified applicants will receive consideration for employment without regard to race color religion sex age national origin protected veteran status disability status sexual orientation gender identity or expression marital status genetic information or any other characteristic protected by law unitedhealth group is a drug-free workplace candidates are required to pass a drug test before beginning employment
job keywords: real time analytics predictive analytics artificial intelligence data science machine learning big data hadoop sas nosql python hive pig hbase flume telecommute unitedhealth group uhg telecommute remote work from home

superior group is looking for a senior manager - it analytics for our client located in buffalo grove il
- lead a team of data engineers to execute internal and external projects
- proactively drive the vision for data warehousing across the company and define and execute on a plan to achieve that vision
- define the processes needed to achieve operational excellence in all areas including project management and system reliability
- build a high-quality data warehousing team and design the team to scale
- build cross functional relationships with business stakeholders data scientists and software engineers to understand data needs and deliver on those needs
- manage data warehouse plans across the company drive the design building and launching of new data models and data pipelines in production
- manage development of data resources and support new data needs
- drive data quality across the company
- define and manage slas for all data sets and data warehouse processes running in production
- effectively manage business clients ensure a high level of quality and satisfaction in delivered services
- work with database tools or scripting languages to process data from a variety of sources data sources may include ecommerce web analytics marketing crm and other data sets
- develop and perform quality checks for both ad hoc and ongoing processes monitor and maintain these automated processes
- establish monitors and alerts for pro-active monitoring of data processing performance
- strategic data management knowledge including: data warehousing data governance data architecture data asset development data quality meta data management reporting and analytics and infrastructure
required skills qualifications:
- bachelor’s degree in computer science math physics
- minimum of 6 years’ experience as senior manager - it analytics in data warehousing
- minimum of 3 years’ experience with sql python e-commerce
- minimum of 3 years’ experience with aws redshift and google bigquery
preferred skills qualifications:
- experience in leading strategic design delivery and governance of major cross functional business efforts
- experience in designing and building extract transform and load (etl) processes
- experience with
enterprise metadata management and industry standards
- experience in operationalizing data governance data stewardship and data quality
- working knowledge of multiple technology systems and data management tools
- expertise in database and data warehouse migrations
- ability to understand complex issues and develop meaningful analysis and recommendations to line of business and executive management
- effective communication skills able to support independent viewpoints to executive management
- polished oral communication
- strong writing abilities and experience with writing a variety of communication pieces
- strong customer service
- work under tight deadlines and be adaptable to changing assignments
- manage multiple assignments
- strong attention to technical detail
- partner with professionals consultants stakeholders and staff with sensitivity to their needs and priorities
- negotiation to find mutually acceptable solutions; building consensus through give and take
- effective interpersonal and relationship building
- working effectively in both independent and team situations
- time and project management skills
additional information:
- upon offer of employment the individual will be subject to a background check and a drug screen
go beyond www superiorjobs com eeo employer - minorities females disabled veterans sexual orientation gender identity

as a principal data solutions architect on our data platform team you’ll work with fun brilliant people in a start-up environment to create irobot’s next generation data-driven strategy we are seeking a special big data oriented architect to take a leadership role in building our next generation cloud-based data platform we are seeking someone with deep expertise and real world experience in implementing leading edge data architecture concepts capabilities and technologies to enable large scale data ingestion flexible data lake constructs data enrichment api capabilities with a focus on maintaining security and privacy the platform will drive robot intelligence performance in the real world personalization deep learning new digital capabilities and the backbone of many future irobot offerings you will be part of the formation of the new platform team drive progress of the data platform and you’ll have leadership over technical direction for future data-related efforts we are seeking someone who can accelerate development through newer technologies and can rethink the usage of existing technologies we have a saying that it doesn’t count unless it’s real so if you’re the “go to” person for cutting edge big data solutions and have the battle scars and victory banners to prove it then we want to work with you!
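
a minimal sketch assuming aws s3 as the data lake landing zone this role describes; the bucket name and key layout are invented not irobot’s

    import json
    from datetime import datetime, timezone

    import boto3

    BUCKET = "example-telemetry-lake"  # invented

    def land_event(event):
        """write one telemetry event into a date-partitioned raw zone"""
        now = datetime.now(timezone.utc)
        key = "raw/telemetry/dt={:%Y-%m-%d}/{}-{:%H%M%S%f}.json".format(
            now, event["robot_id"], now)
        boto3.client("s3").put_object(Bucket=BUCKET, Key=key,
                                      Body=json.dumps(event))
        return key

    # date partitioning keeps downstream scans cheap which matters for the
    # cost-conscious design the role stresses
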
design & implement next generation platform – create a brand new global leading edge cloud-based data architecture for real time real world robot consumer and digital applications
thought leadership – partner with solution architects data science and software engineering teams to provide thought leadership on cloud-based data solutions
applied logic – develop a deep understanding of irobot’s business model goals robot and consumer needs and data-based possibilities and create awe-inspiring design solutions
platform & team
cost effective engineering – engage with irobot and aws management and technical staff to appropriately understand and recommend solutions that balance aws costs against the resulting technical benefits
advise – advisory role to executive engineering data and product leadership (dir vps & svps) and ability to influence key digital strategies
hands-on participation – hands on participation in the full life-cycle of definition design implementation testing and support
demonstrable track record of using cutting edge technology for real world success
deep experience designing implementing and supporting large-scale global data solutions leveraging aws infrastructure and services
strong experience with cloud-based big data platforms and technologies
extensive programming experience in scripting and programming languages such as java python r ruby go bash etc
deep understanding of cost model(s) and cost-conscious design principles
ability to create clear and detailed technical diagrams and documentation
ability to roll up your sleeves and implement alongside the other engineers being coached and developed
experience developing and maintaining global data privacy and security architectures a plus
bs in computer science or related field with 7+ years of experience
we are the leading global consumer robot company designing and building robots that empower people to do more both inside and outside of the home founded by mit roboticists who had the vision of making practical robots a reality to date we have sold over 20 million robots and globally employ more than 600 of the robot industry’s best and brightest irobot is committed to fostering invention discovery and technological exploration in the pursuit of practical and valuable robot products for the home irobot stock trades on the nasdaq stock market under the ticker symbol irbt irobot is headquartered in bedford massachusetts accessible by our corporate shuttle directly from alewife station we also have offices in california europe japan china & hong kong imagine the future you could help us build as a fellow iroboteer!
check out #lifeatirobot and follow us on instagram: @irobotcareers
irobot is an equal opportunity employer

overview: based in northern va axiologic solutions llc has opportunities for you to become part of our high-quality team that delivers innovative solutions to key federal clients we are currently seeking a data sme to provide expertise in it project portfolio enterprise data architecture projects in dod and the intelligence community (ic)
responsibility:
evaluate and integrate recommender service technology for use within the pia information technology environment in order to increase the efficiency of intelligence analysts accessing and evaluating large amounts of content coordinate with academia and the national labs to evaluate and evolve recommender technologies for use within pia's environment the state of the outcome of the research needs to be evaluated for maturity with respect to readiness for operational use in general and applicability to pia's data and technology environment specifically integrate a recommender technology service into pia analysts' technical environment delivering efficiency to analytical processes by automatically identifying those items that are likely to be of highest value for each individual analyst integration will involve a combination of software procurement software engineering and or technology integration involve analysts throughout the process to ensure that the design and implementation meets the needs of the users support the client with comprehending the context of its program data by recommending qualitative and quantitative relationships including patterns and trends from large amounts of data and providing analytic support to help inform policy rational decision making and resource allocation provide it strategic guidance coordination of project management and communications support efforts across a matrixed organization provides functional data management support to include the effective execution of master data management master data maintenance and data governance practices and techniques provide guidance to the customer on best practices
other tasks as assigned
qualification:
must have an active current ts sci and ci poly
approximately 10 years of experience in information systems development focused on processing large volume near-real time data feeds to meet data analytics and security requirements
bachelor's degree in computer science information systems management mathematics engineering or other relevant discipline
demonstrated work experience in architecting new system solutions using current technologies as well as transitioning existing systems into modern technologies without negatively impacting operational and compliance requirements
demonstrated experience performing data assessment data engineering modeling and analytics to enable new methodologies for end user analysts data scientists etc
experience with cloud technologies aws c2s containers data layers micro-services system administration and sql nosql database experience
demonstrated experience as a sme to prioritize and meet tactical and strategic requirements for frameworks and systems to process data
demonstrated experience in information systems development across the it lifecycle with a focus on systems to perform high volume and velocity data processing on disparate data types formats
demonstrated experience delivering automated and scalable data process monitoring and data quality systems and processes
we are proud of our diverse environment eoe m f disability vet sexual
orientation gender identity

purpose of role organizational unit: this is a senior position that requires extensive knowledge of big data tool administration data management related technologies integration this organizational unit is for data science analytics technology management and operational services delivery of the various elements of the big data and analytics data integration business intelligence management in this capacity this senior big data specialist role is responsible for infrastructure architecture design and their implementation for big data services using existing enterprise data assets (hadoop data lake data warehouse and master data management)
functions performed: the following functions are the responsibility of the senior big data specialist within the data management team this individual will be mainly responsible to manage configure and maintain large scale multi-tenant cloudera hadoop cluster environments perform the performance tuning and code migrations from dev to qa and production environment within the data lake also subject matter expert for other data integration tool management and governance such as informatica etc
position responsibilities
• implementing managing and providing support for large cloudera hadoop clusters across all environments (dev qa & production)
• subject matter expertise supporting and governing other data integration tools in the environment
• working with multiple teams and colleagues at every level of the organization
• managing large-scale infrastructure projects and working experience with red hat enterprise linux 6 and 7 systems
• cloudera administration experience working with - hdfs yarn zookeeper map reduce spark impala hue oozie sqoop kafka hive and kudu
• deploying & maintaining hadoop clusters also de-commission & commission of nodes using cloudera
• configuring the name node high availability and keeping track of all the running hadoop jobs
• takes care of the day-to-day running of hadoop clusters
• work closely with the infrastructure team database team network team bi team and application teams to make sure that all the big data applications are highly available and performing as expected
• responsible for capacity planning and estimating the requirements for lowering or increasing the capacity of the hadoop cluster
• responsible for deciding the size of the hadoop cluster based on the data to be stored in hdfs
• performing backup and recovery using the cloudera bdr tool
• enabling snapshot backups and point-in-time recovery using cloudera
• handle all hadoop environment builds including design security capacity planning cluster setup performance tuning and ongoing monitoring
• perform high-level day-to-day operational maintenance support and upgrades for the cloudera hadoop cluster
• research and recommend innovative and where possible automated approaches for system administration tasks
• creation of key performance metrics measuring the utilization performance and overall health of the cluster
• deploy new upgraded hardware and software releases and establish proper communication channels
• ability to collaborate with product managers lead engineers and data scientists on all facets of the hadoop eco-system
• ensure existing data information assets are secure and adhering to a best in class security model
• troubleshooting application errors and ensuring that they do not occur again
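
a minimal sketch of the cluster health metric collection described above shelling out to the standard hdfs admin report; the exact report formatting varies by hadoop version so the parsing here is illustrative

    import subprocess

    def hdfs_capacity_report():
        """parse `hdfs dfsadmin -report` into a few key utilization figures"""
        out = subprocess.run(["hdfs", "dfsadmin", "-report"],
                             capture_output=True, text=True, check=True).stdout
        stats = {}
        for line in out.splitlines():
            if line.startswith(("Configured Capacity:", "DFS Used:",
                                "DFS Remaining:")):
                name, _, value = line.partition(":")
                stats[name] = value.strip()
            elif line.startswith("Live datanodes"):
                # e.g. "Live datanodes (12):"
                stats["live datanodes"] = line.split("(")[-1].rstrip("):")
        return stats

    # run from an edge node with cluster config in place:
    # print(hdfs_capacity_report())
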
skills required:
• bachelor’s degree in computer science information systems or equivalent
• minimum of 8 years in data tools and with a focus on quality and reliability in design
• required - cloudera hadoop experience expertise in cloudera hadoop 5.10 and above
• preferred - 2+ years’ experience in data lake design and implementation
educational requirements
• experience with cloudera hadoop distribution (hive hbase spark)
• experience with data integration and transformation software and data warehouse master data
• creation of complex parallel loads and dependency creation using workflows
• must have experience in installing and configuring cdh & cm version 5.10 and above
• experience in real-time analytics
unique competencies
• expertise in scala java python spark perl shell programming and other big data development technologies
• preferred - data lake etl skills and informatica skills
• preferred - experience in web services

we are a leading global online software development company looking for a highly skilled senior data science engineer to join our growing team our company is committed to helping entrepreneurs grow their businesses and giving online users the chance to create their perfect experiences and we are looking for someone who shares that passion the senior data science engineer will be a key player in helping us build an advanced platform for optimal consumer use
responsibilities include:
using exceptional architectural skills to design implement and troubleshoot high volume recommendation systems
working closely with the product management user experience and engineering teams to build a highly optimized platform for customers
paying close attention to details and seeing the big picture while in the building stages to ensure that you reach the desired outcome
candidate requirements:
b s or m s in cs or an advanced understanding and work experience in computer science and software development
5 or more years of programming experience
strong experience with java services (development debugging profiling & load-testing)
experience with machine learning and large data frameworks such as hadoop and spark a plus

great opportunity to work for a great organization as an enterprise data architect at a senior director's level in the westchester new york area the incumbent must be able to understand and navigate the business side understanding how the data is connected to the products and services and how the product is pushed out this is a full time role but can also be done on a contract basis
job description
work with the product and leadership teams to define our enterprise data model and architecture
provide architecture guidance best practices and detailed design to the development team for data integration projects across platforms
understand how data relates to the current products and operations and the effects that any future changes will have on data throughout the organization
work with the solutions architect and broader data engineering development team to incorporate feedback into data models
work with the internal team to build process in support of data information lifecycle management governance lineage and quality
qualifications
a ba bs degree in data science or equivalent experience
5+ years of experience in conceptual logical physical data modeling
experience with data modeling design patterns 3nf and dimensional modeling building highly scalable and secured solutions
strong understanding of cloud architecture specifically amazon (i e redshift) as it relates to data processing
experience leading and architecting enterprise wide initiatives specifically system integration data lakes data warehouse etc
able to
confidently express the benefits and constraints of technology solutions to technology partners stakeholders team members and senior levels of management
understanding of pii standards processes and security protocols
familiar with data anonymization concepts and technologies preferred
benefits include: health insurance dental 401k life insurance flexible spending account vision health saving account healthcare on-site retiree health and medical pension plan vacation & paid time off sick days paid holidays tuition assistance and many more
equal opportunity statement
somerset global solutions is an equal opportunity employer all applicants will be considered for employment without attention to race color religion sex sexual orientation gender identity national origin veterans or disability status and prohibits workplace discrimination and harassment of any kind

multiple bi data engineer roles at principal and or senior level focus mainly on data engineering data science machine learning implementation for business projects data replication for bi and other app teams data warehousing for reporting as well as adhoc data analysis and bi and data science platform tech stack platform is netezza informatica v9 5 1 erwin v9 5 python v3 aws redshift s3 kinesis

job summary
we are seeking an experienced product manager product owner to join our data platform team supporting the integration management security discovery and analysis of multiple data types into services and products for cancer research in this position you will be responsible for driving and defining the launch of new data platform services to support a variety of fred hutch cancer research endeavors you should have a solid grasp on software as a service clinical genomic or specimen management or analysis and a passion for data in order to deliver high impact infrastructure products to accelerate cancer research
responsibilities:
work with stakeholders and team members to develop high quality user stories and mock-ups for engineering scrum teams
create and maintain backlog and roadmap for product update and communicate changes regularly to team and to stakeholders
regularly communicate to team and organization the value that new data management technologies bring to users and to the fred hutch mission
work with end-users to manage expectations provide training answer questions triage support tickets and document and prioritize new feature requests
be part of team ceremonies understand progress and challenges provide input from the product perspective and help remove any obstacles maintain clarity and momentum in the process
author product documents samples demos and other user readiness materials
regularly present feature demos to users to solicit feedback
identify user and sponsor personas user needs and work with ux ui team to develop product interaction and look feel
create buy-in for product portfolio vision internally and externally
conduct interviews coordinate focus groups and attend steering committees etc with scientific or technical thought leaders to identify or refine opportunities
stay up-to-date with the evolving scientific landscape in commercial and academic domains
inspire teams set measurable goals and facilitate urgency and creativity keep the teams focused on creating value for our researchers and releasing impactful features for each project iteration
define what success means develop metrics analyze impact and user feedback and communicate the outcomes learnings and next steps to the broader organization
act as a thought leader
and expert on data management and data integration technologies
complete strategic plans for data management based on the fred hutch strategic plan
qualifications:
strong desire to help scientists see and understand data work on some of the toughest challenges in the data landscape to help researchers connect clean and prepare data for analysis at scale to lead to new discoveries
passion for leading a team that will play a significant role in data platform innovation and expanding offerings to support the latest market and technology trends around integration and data management
able to build strong interpersonal relationships with product team strategic partners leadership senior management and other stakeholders
experience with open source technologies such as kafka spark data science notebooks and streaming data pipelines
experience launching software as a service saas
7+ years experience in domain (ehr clinical specimen genomic data) product management business analyst project management similar
or master's phd bachelor's degree in life sciences computer science other relevant field foreign degree equivalent or relevant domain experience

shipt has a wide variety of data partners and as we continue to sign more retailer and vendor partnerships in a national rollout we’re looking to grow the integrations and etl group within data engineering data engineering at shipt primarily focuses on retailer catalog and general product data for e-commerce purposes you’ll focus on data integrations and developing etl processes that ingest clean and normalize a variety of data sources into valuable data sets
what you'll do:
develop and maintain pipelines responsible for ingesting large amounts of data from various sources
help evolve our data model for new retailers and new retail verticals
work with the catalog team to improve product data quality and fidelity
engage with the shipt partner success team and external partners to launch new retailers and data sources
be a part of the technical design review process to build scalable processes
collaborate with other teams across the organization (e g data science) to enable the better use and understanding of data
what we're looking for:
2+ years of etl experience
proficiency in python is required (this is our primary etl language)
proficiency in sql is required (we use postgresql and redshift)
a keen attention to detail
experience with queues and or streams is a plus
experience with key-value data stores is a plus (we primarily use redis and dynamodb)
a bachelor’s degree in cs information systems a related field or equivalent work experience
we are an equal opportunity employer and value diversity at our company we do not discriminate on the basis of race religion color national origin gender sexual orientation age marital status veteran status or disability status
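
a minimal sketch of the ingest-clean-normalize etl work shipt describes in its primary etl language; the raw input shape and target record are invented for illustration

    import csv
    import io

    # invented raw feed: inconsistent whitespace, casing and price formats
    RAW = "sku,name,price\n123, Cola 12oz ,$1.99\n124,chips BBQ,2.49\n"

    def normalize(row):
        """coerce one raw catalog row into a consistent product record"""
        return {
            "sku": int(row["sku"]),
            "name": row["name"].strip().lower(),
            "price_cents": int(round(float(row["price"].lstrip("$")) * 100)),
        }

    products = [normalize(r) for r in csv.DictReader(io.StringIO(RAW))]
    print(products)
    # [{'sku': 123, 'name': 'cola 12oz', 'price_cents': 199},
    #  {'sku': 124, 'name': 'chips bbq', 'price_cents': 249}]
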
Description / position summary:
• Very strong engineering skills; should have an analytical approach and good programming skills.
• Provide business insights while leveraging internal tools and systems, databases, and industry data.
• Minimum of 5+ years' experience; experience in the retail business will be a plus.
• Excellent written and verbal communication skills for varied audiences on engineering subject matter.
• Ability to document requirements, data lineage, and subject matter in both business and technical terminology.
• Guide, and learn from, other team members.
• Demonstrated ability to transform business requirements into code, specific analytical reports, and tools.
• This role will involve coding, analytical modeling, root cause analysis, investigation, debugging, testing, and collaboration with business partners, product managers, and other engineering teams.

Must have:
• Strong analytical background.
• Self-starter.
• Must be able to reach out to others and thrive in a fast-paced environment.
• Strong background in transforming big data into business insights.

Technical requirements:
• Experience working with large data sets and with distributed computing (MapReduce, Hadoop, Hive, Pig, Apache Spark, etc.).
• Knowledge of and experience with physical design and implementation, and SQL performance optimization, on any MPP database, preferably Teradata.
• Advanced SQL.
• Strong Hadoop scripting skills to process petabytes of data.
• Experience in Unix/Linux shell scripting or similar programming/scripting knowledge.
• Experience in ETL processes.
• Real-time data ingestion (Kafka); a minimal consumer sketch follows this posting.

Nice to have:
• Development experience with Java, Scala, Flume, Python.
• Cassandra.
• Automic scheduler.
• R/RStudio or SAS experience a plus.
• Presto.
• HBase.
• Tableau or a similar reporting/dashboarding tool.
• Modeling and data science background.
• Retail industry background.

Education: BS degree in specific technical fields like computer science, math, or statistics preferred.
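A minimal real-time ingestion sketch, assuming the kafka-python package and a local broker; the topic name and event fields are hypothetical:

    from kafka import KafkaConsumer  # pip install kafka-python
    import json

    # Broker address and topic are invented for illustration.
    consumer = KafkaConsumer(
        "clickstream-events",
        bootstrap_servers="localhost:9092",
        value_deserializer=lambda b: json.loads(b.decode("utf-8")),
        auto_offset_reset="earliest",
    )

    for message in consumer:
        event = message.value
        # Downstream, a real job would validate, enrich, and land the event
        # in an HDFS or Teradata staging area.
        print(event.get("event_type"), event.get("ts"))

The deserializer keeps parsing concerns at the edge of the pipeline so everything downstream works with plain dictionaries.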
Analytics Engagement Manager – Big Data. EXL (NASDAQ: EXLS) is a leading operations management and analytics company that helps businesses enhance growth and profitability in the face of relentless competition and continuous disruption. Using our proprietary, award-winning Business EXLerator Framework™, which integrates analytics, automation, benchmarking, BPO, consulting, industry best practices, and technology platforms, we look deeper to help companies improve global operations, enhance data-driven insights, increase customer satisfaction, and manage risk and compliance. EXL serves the insurance, healthcare, banking and financial services, utilities, travel, and transportation and logistics industries. Headquartered in New York, New York, EXL has more than 24,000 professionals in locations throughout the United States, Europe, Asia (primarily India and the Philippines), Latin America, Australia, and South Africa.

EXL Analytics provides data-driven, action-oriented solutions to business problems through statistical data mining, cutting-edge analytics techniques, and a consultative approach. Leveraging proprietary methodology and best-of-breed technology, EXL Analytics takes an industry-specific approach to transform our clients' decision making and embed analytics more deeply into their business processes. Our global footprint of nearly 2,000 data scientists and analysts assists client organizations with complex risk minimization methods, advanced marketing, pricing and CRM strategies, internal cost analysis, and cost and resource optimization. EXL Analytics serves the insurance, healthcare, banking, capital markets, utilities, retail and e-commerce, travel, and transportation and logistics industries. Please visit www.exlservice.com for more information about EXL Analytics.

Role:
- Is the primary day-to-day client contact, often interacting with C-level executives, and ensures client needs are well understood, defined, and met.
- Serves as the primary interface between senior client management and EXL senior leadership (VPs and SVPs).
- Interacts regularly with clients to understand business requirements, define analytical problems, structure and communicate solutions, and ensure client satisfaction, with a strong focus on driving business results.
- Manages/leads 1-2 engagement projects with different team structures and service delivery models, and is responsible for driving revenue generated from these accounts.
- Leads project teams of 3-6 consultants or more, and/or team leads and analysts, in all aspects of project execution.
- Proficiency in the application and usage of Hadoop architecture and development, data engineering using Kafka, SQL querying from RDBMSs, etc., for project delivery.
- Plays a critical role in defining the problem, structuring the solution, and executing against it.
- Clearly defines project deliverables, timelines, and methodology, laying out the project plan.
- Owns the execution of the project, with on-time delivery every time, ensuring all project goals are met.
- Manages team members, including definition of objectives, oversight of execution, and evaluation of performance.
- Provides thought leadership and delivers business insights to identify and resolve complex issues critical to clients' success.
- Actively contributes to business development and to attracting, retaining, developing, and motivating a team of diverse and qualified staff.
- Manages communication between senior EXL partners and clients to update project progress and solicit feedback on project deliverables.

Requirements:
- 4+ years of experience comprising analytics service delivery, consulting, solution design, and client management.
- Experience in marketing, operations, and clinical analytics strategy, project management, cost reduction, and business development.
- Experience and strong knowledge of big data environments and applications, such as: Hadoop architecture and development (Cloudera preferred); data engineering with MapReduce, Sqoop, Hive, Spark, etc.; streaming tools (Kafka preferred).
- Proficiency in writing SQL on one or more mainstream RDBMSs, such as Teradata or Oracle.
- Hands-on experience with large data warehousing (Hadoop) or business intelligence implementations.
- Experience with enterprise scheduling tools (CA7 preferred).
- Proficiency with ingestion tools, Sqoop and/or Informatica, is preferred.
- Agile software development environment experience.
- Demonstrable leadership ability, superior problem solving, and people management skills.
- Excellent listening, written communication, and presentation skills.

What we offer: EXL Analytics offers an exciting, fast-paced, and innovative environment that brings together a group of sharp and entrepreneurial professionals who are eager to influence business decisions. From your very first day, you get an opportunity to work closely with highly experienced, world-class analytics consultants. You can expect to learn many aspects of the businesses that our clients engage in. You will also learn effective teamwork and time-management skills, key aspects of personal and professional growth. Analytics requires different skill sets at different levels within the organization; at EXL Analytics we invest heavily in training you in all aspects of analytics, as well as in leading analytical tools and techniques. We provide guidance and coaching to every employee through our mentoring program, wherein every junior-level employee is assigned a senior-level professional as an advisor. The sky is the limit for our team members: the unique experiences gathered at EXL Analytics set the stage for further growth and development in our company and beyond. EOE: minorities, females, vets, disabilities.
Join our team and experience Workday! It's fun to work at a company where people truly believe in what they're doing. At Workday, we're committed to bringing passion and customer focus to the business of enterprise applications. We work hard, and we're serious about what we do, but we like to have a good time too. In fact, we run our company with that principle in mind every day: one of our core values is fun.

Job description: We are looking for a functional architect/data scientist to expand our intelligent data preparation and visualization capabilities in Workday Prism Analytics. This role will partner closely with product management to explore, prototype, and design smart features using statistical functions, machine learning algorithms, and other programming techniques. You will be focused on building platform capabilities that work not only with Workday-delivered data sources but also with external data brought in via Workday Prism Analytics. These capabilities are targeted at making business users more efficient and confident when performing data preparation and analysis. You will be responsible for the hands-on design of strategies and algorithms for improving system intelligence. This is a data-driven role, so conducting experiments and working with usability research to perform studies are critical to the success of this position. Ultimately, you will interface with engineering to turn concepts and designs into production code. As a functional architect/data scientist, your position will require that you interact with customers; you must be comfortable attending customer meetings and presenting your ideas both internally and externally, and you must have effective verbal and written communication.

Responsibilities:
- Work with product managers and design to formulate the statistics and ML product vision and design.
- Propose statistical/machine-learning-based models, methodologies, or other programming techniques to solve the problem.
- Propose accuracy measures and validation criteria for the model (a minimal validation sketch follows this posting).
- Conduct experiments and iterate with product management, design, and customers to come up with a viable solution.
- Partner with the engineering team on product execution.

Requirements: **This is not an entry-level position.**
- Applied experience and expertise in applied machine learning or applied statistics.
- MS/PhD degree in computer science, applied math, statistics, operations research, computational physics, computational biology, or other quantitative fields.
- Proven track record of analyzing large-scale, complex data sets, modeling, and machine learning algorithms.
- In-depth understanding of statistical analytics techniques and machine learning algorithms.
- Mastery of at least one statistical modeling tool, such as R, MATLAB, SAS, Python (NumPy, SciPy, pandas), or MLlib.
- Business intelligence, analytics, or ETL experience required.
- Experience using the Spark ML stack (DataFrame, MLlib, GraphX) is preferred.
- Curiosity about data and a strong desire to continuously improve.
- Fast learner, detail oriented, and must enjoy fast-paced work environments.
- Excellent analytical, collaboration, verbal, and written skills.
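A minimal sketch of "accuracy measures and validation criteria" of the kind the posting asks for, using scikit-learn on a bundled dataset; this is an illustration under assumptions, not Workday's methodology:

    from sklearn.datasets import load_breast_cancer
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split, cross_val_score
    from sklearn.metrics import accuracy_score, confusion_matrix

    X, y = load_breast_cancer(return_X_y=True)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

    model = LogisticRegression(max_iter=5000).fit(X_tr, y_tr)
    pred = model.predict(X_te)

    print("holdout accuracy:", accuracy_score(y_te, pred))
    print("confusion matrix:\n", confusion_matrix(y_te, pred))
    # Cross-validation adds a variance estimate, a more defensible
    # acceptance criterion than a single holdout number.
    print("5-fold CV accuracy:", cross_val_score(model, X, y, cv=5).mean())

Reporting the confusion matrix alongside accuracy matters whenever the two error types carry different business costs.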
Data Architect. If you're a healthcare professional who enjoys bringing new techniques and solutions to this industry, then this will be an exciting opportunity for you! Ektello is searching for a data architect for one of our top healthcare IT business clients in Waltham, Massachusetts. Our client is a leading healthcare SaaS product company that brings together thought leadership and new techniques in research solutions, focusing on patient engagement, medical research, and much more.

Responsibilities:
- Determine data requirements and structure by analyzing existing business systems and processes.
- Ensure flexible data models for future business transformation activities.
- Maintain database performance.
- Create migration plans, process flows, and dependency diagrams as business needs arise.
- Partner with cross-functional and third-party vendor teams.

Qualifications:
- Proven experience as a data architect, data scientist, or in a similar role.
- Ability to independently create blueprints as well as data models, from requirements or from independent systems forensics.
- Experience with Informatica products, especially the data management suite.
- Experience with cloud hosting environments such as AWS or Azure.
- Familiarity with Salesforce, SalesLogix, NetSuite, or Oracle DB.
- Familiarity with data visualization tools (e.g., Tableau, D3.js, and R).
- Familiarity with Boomi, Pentaho, or Informatica ETL.
- Experience with service-oriented architecture (SOA), web services, enterprise data management, information security, applications development, and cloud-based architectures.
- In-depth understanding of database structure principles.
- Knowledge of data mining and segmentation techniques.
- Passion for technology and self-education to recommend new technologies and techniques.
- Appreciation of an agile development environment.
- Bachelor's degree in computer science, information science, or similar fields.

Position: Big Data Architect (Guidewire PolicyCenter). Location: Madison, WI. Start date: ASAP. Type: C2C, contract to hire.

Data lake design and concepts, along with big data experience in the Hadoop ecosystem (Hive, Pig, Impala, and related technologies such as Spark), plus MPP shared-nothing database systems and NoSQL systems. 5+ years' experience with Hadoop, Hive, Pig, Impala, HBase, and related technologies; 3+ years' experience with MPP shared-nothing database systems and NoSQL systems. Minimum ten (10) years of insurance industry experience, with three (3) years of experience with Guidewire PolicyCenter, and proven Guidewire business analyst experience. Must have experience with the following personal lines of insurance: auto, property, and umbrella; and commercial lines of business: general liability, workers' comp, business owners, crime, cyber risk, professional liability, commercial auto, and commercial property. Review the comprehensiveness of attribution for the insurance data model. Must have good knowledge of canonical data models and industry standards like ACORD (a hypothetical canonical record sketch follows this posting). Must have good knowledge of data science and other BI/operational insurance use cases, and the ability to define the use cases and tie them to the data layers; define the test scenarios and guide testers in writing scenarios and test cases. Facilitate workshops to gather data and reporting requirements from stakeholders; analyze reports; interact with designers to understand system limitations; identify solutions to business problems; specify requirements in enough detail to be successfully implemented; identify and communicate solution options across multiple lines of business. Ability to work in a fast-paced, team-oriented environment, and to work with end users to gather requirements and convert them into working documents. Strong interpersonal skills, including a positive, solution-oriented attitude. Must be passionate, flexible, and innovative in utilizing the tools, their experience, and any other resources to effectively deliver against very challenging and always-changing business requirements with continuous success. Must be able to interface with various solution/business areas to understand the requirements and prepare documentation to support development.
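A hypothetical sketch of a canonical policy entity, loosely ACORD-inspired; the field names are invented for illustration and are not taken from the posting:

    from dataclasses import dataclass, field
    from datetime import date
    from typing import List

    @dataclass
    class Coverage:
        line_of_business: str      # e.g. "personal_auto", "workers_comp"
        limit: int                 # per-occurrence limit in whole dollars
        deductible: int

    @dataclass
    class CanonicalPolicy:
        policy_number: str
        insured_name: str
        effective_date: date
        expiration_date: date
        coverages: List[Coverage] = field(default_factory=list)

    # Source systems (e.g. PolicyCenter extracts) map into this one shape, so
    # downstream data layers and BI use cases see a single model.
    p = CanonicalPolicy("POL-001", "Acme LLC", date(2024, 1, 1), date(2025, 1, 1),
                        [Coverage("general_liability", 1_000_000, 10_000)])
    print(p.policy_number, len(p.coverages))

The point of a canonical model is that every personal and commercial line maps into the same structure, so attribution completeness can be reviewed in one place.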
Responsibilities:
- Validate the architecture and provide details and shared ownership for the architecture of our next-generation data warehousing and big data systems, leading and working hands-on towards implementation and delivery to production.
- Help lead the charge on a data lake / operational data store strategy, ensuring rapid delivery while taking responsibility for applying standards, principles, theories, and concepts.
- Responsible for the design and delivery of data models that power BI initiatives, dashboards, syndicated reporting, and ad hoc data exploratory canvases.
- Work with data architects on the logical data models and physical database designs, optimized for performance, availability, and reliability.
- Tuning and optimization of backend and frontend data operations.
- Serve as a query tuning and optimization technical expert, providing feedback to the team (a small tuning sketch follows this posting).
- Scripting and automation to support development, QA, and production database environments, and deployments to production.
- Proactively help resolve difficult technical issues.
- Provide technical knowledge to teams during project discovery and architecture phases.
- Assess new initiatives to determine the work effort and estimate the necessary time to completion.
- Document new development procedures or test plans as needed.
- Participate in data builds and deployment efforts, and in projects through various phases.
- Perform other related duties as assigned.
- Partner with the business units to develop effective solutions that solve business challenges.

Competencies:
- Self-starter who gets results with minimal support and direction in a fast-paced environment.
- Takes initiative; challenges the status quo to drive change.
- Learns quickly; takes smart risks to experiment and learn.
- Works well with others; builds trust and maintains credibility.
- Planful: identifies and confirms key requirements in dynamic environments; anticipates tasks and contingencies.
- Communicates effectively: productive verbal and written communication with clients and all key stakeholders.
- Perseverance: stays the course despite challenges and setbacks; works well under pressure.
- Strong analytical skills; able to apply inductive and deductive thinking to generate solutions for complex problems.

Saama values: Integrity: we do the 'right' things. Innovation: we 'change the game.' Transparency: we communicate openly. Collaboration: we work as one team. Problem-solving: we solve core, complex business challenges. Enjoy & celebrate: we have fun.
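A small query-tuning sketch of the kind this role performs, assuming a PostgreSQL-style database and the psycopg2 driver; the DSN and table names are invented:

    import psycopg2

    conn = psycopg2.connect("dbname=analytics user=tuner")  # assumed DSN
    cur = conn.cursor()

    # EXPLAIN ANALYZE runs the query and returns the actual plan and timings.
    cur.execute("""
        EXPLAIN ANALYZE
        SELECT customer_id, SUM(amount)
        FROM fact_orders
        WHERE order_date >= '2024-01-01'
        GROUP BY customer_id
    """)
    for (line,) in cur.fetchall():
        # Look for sequential scans on large tables, misestimated row counts,
        # and sorts spilling to disk -- the usual tuning targets.
        print(line)

Comparing estimated versus actual row counts in the plan is usually the fastest way to spot stale statistics or a missing index.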
Company description: Element 84 is a team of talented individuals leading innovation in several different areas, ranging from growing video streaming services to NASA projects that interact with petabytes of Earth data. The team is continuing to grow as we take on some of the toughest challenges in the industry. Every member of Element 84 is invested in one another to deliver reliable products that benefit our world.

Job description: Element 84 is looking for a passionate data analyst who has experience with subscription-based video services. The ideal candidate will be able to design data models that produce graphs and reports to help drive business decisions, as well as provide insights into user behavior and trends. They will also be able to identify critical metrics and KPIs that help drive the success of a subscription video service.

Primary responsibilities:
- Interpret data, analyze results using statistical techniques, and provide ongoing reports.
- Provide ideas and suggestions on ways to improve KPIs for a subscription video service based on data (a toy churn-rate calculation follows this posting).
- Design and assemble large, complex datasets that meet functional and non-functional business requirements.
- Work with the latest technologies in big data, such as BigQuery, Redshift, and Hadoop, to deliver highly scalable solutions.
- Utilize business intelligence tools to deliver data through visualizations of business performance metrics that can be analyzed by various stakeholders.
- Discover new ways to combine complex datasets that answer questions for the business.
- Assist in the continual growth of the product by coming up with innovative ways to view and interpret data.
- Contribute to process-improvement initiatives in an effort to maximize velocity for the data science team.
- Foster teamwork and a spirit of collaboration among team members and business partners, while acting as a driver for strategy and opportunities informed by the business data.
- Remain current with new technologies in order to drive innovation and continue to grow the product and team.
- Contribute to the team's success through analysis, solution design, development, and delivery.
- Experience implementing data systems using some of the following technologies/concepts: Tableau, Domo, Sisense, or similar business intelligence platforms; Google Analytics / Data Studio; Apache Spark, Hadoop, or other big data technologies.

Qualifications:
- 3+ years of proven work experience as a data analyst or business data analyst for subscription video services.
- Understanding of the KPIs that drive subscription services.
- Working knowledge of programming languages such as Python, SQL, R, and/or Java.
- Demonstrated skills or familiarity with big data technologies, preferably AWS, Azure, and/or Google Cloud services.
- Ability to work collaboratively in cross-functional teams, and to be self-directed.
- Strong analytic skills related to working with unstructured datasets.

Additional information: Paying attention to who we are, as a company, as people, as family members, friends, and colleagues, is probably the biggest part of who we are. There are lots of ways to run a company, and you have probably experienced more than your share. For us, it's wanting to come to work, being around people we enjoy, taking on big things with people you trust, and sharing our achievements as a team. You'll get credit when things go right, and we'll have your back when things go wrong. We only take on work that is challenging and right for us; there are projects we will turn down, and the team has a say. We may be a small company, but we have big-company benefits meant to support the idea that we're here for the long term and that happiness comes from much more than just where you work, including 401k, health and dental insurance, life and disability insurance, flexible schedule, cell phone stipend, flexible spending accounts for transportation and dependent care, and a generous PTO policy. Plus a little happiness where you work too: cinema displays, an award-winning candy bowl, amazing restaurants, and a brand new office. We have an extraordinary retention rate because we only hire extraordinary people. We hope that's you.
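A toy illustration of one of the KPIs named above, monthly churn rate, computed with pandas from invented subscription data; none of this is Element 84's code:

    import pandas as pd

    subs = pd.DataFrame({
        "user_id":   [1, 2, 3, 4, 5, 6],
        "start":     pd.to_datetime(["2024-01-05"] * 6),
        "cancelled": pd.to_datetime([None, "2024-02-10", None,
                                     "2024-02-20", None, None]),
    })

    month = pd.Period("2024-02")
    # Subscribers active when the month began...
    active_at_start = (subs["start"] < month.start_time).sum()
    # ...and those who cancelled during it.
    churned = subs["cancelled"].dt.to_period("M").eq(month).sum()

    print(f"{month} churn rate: {churned / active_at_start:.1%}")

Pinning the denominator to the start-of-month active base is one common convention; the definition chosen matters more than the arithmetic, which is why KPI definitions belong in the data model.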
Reports to: Business Intelligence Analyst. Job summary: Responsible for managing and coordinating data, as well as designing and building relational databases for data storage and processing. Analyzes company needs and develops recommendations for warehouse implementation, data acquisition and access, and data archiving and recovery. Builds data models and defines the structure, attributes, and nomenclature of data elements. Sets policies and procedures for maintaining data integrity, and manages internal databases. Will develop ongoing business reports, exception reports, and ad hoc reports, and provide support for analytical review, interpretation, and communication of insights as needed. Will also work with the team to identify data requirements and develop output for desired information requirements.

• Proven work experience as a database analyst, category analyst, data scientist, or other related position.
• Experience in SQL or other code writing.
• SPSS Analytics or Modeler, SAS, or other statistical analysis software.
• Analytical and programming aptitude.
• Managing large data sets.
• Ability to use complex computer programs to mine data sources.
• Comfortable with collecting, analyzing, and interpreting big data.
• Strong organization skills.

Primary objectives:
• Manage, cleanse, and harmonize data from multiple sources into a database in order to develop structured files for detailed analysis.
• Preparation of streamlined files for product category and business insight and analytics reviews.
• Develop and support business reporting, exception reporting, and ad hoc reports.
• Support data visualization efforts to maximize the effectiveness of information for users.
• Provide support for business analytics, sales analysis, industry trending, and other key insights.
• Support implementation of analytics software.
• Continued process improvement through the use and output of data.
• Support business intelligence tools, databases, dashboards, systems, and data collection methods.

Education/experience:
• Bachelor's degree in business, computer science, statistics, computer engineering, or a related field.
• 1-3 years' experience as a database analyst, category analyst, systems engineer, or related work experience with database management.
• Must be good with managing and harmonizing large data sets.
• Strong proficiency with statistical analysis software, various computer programming languages, and data warehouse systems.
• Working knowledge of ERP systems.
• Proficient in the Microsoft suite, including Access.
• Understanding of mass merchandising POS portals a plus.

About us: Staff IT Enterprises leverages over 15 years of experience in excellence to empower our clients and candidates. We navigate the ever-evolving world that is information technology expertly and comprehensively by creating innovative talent acquisition solutions. Our solutions benefit every individual, organization, and partner involved, from inception to reality. For our clients, we offer full-service solutions that enable them to catapult their growth and expand their organizations to their full potential, while we offer our candidates a unique relationship with their recruiter that will empower their career and experiences.

Location: US-TX-Dallas, US-CA-San Francisco, US-PA-Philadelphia, US-CO-Denver, or US-GA-Atlanta. Travel: regular and expeditious travel throughout the United States, and occasionally overseas, is required to meet client needs and timetables; must be available to be stationed at, and work from, an out-of-town client site for an extended period of time.

Job summary: Perform architecture design, data modeling, and implementation of big data platforms and analytic applications for Hitachi Consulting's clients. Analyze the latest big data analytic technologies and their innovative applications, in both business intelligence analysis and new service offerings, and bring these insights and best practices to Hitachi Consulting's insights and analytics practice. Stand up and expand data-as-a-service collaboration with partners in the US and other international markets. Apply deep learning capability to improve understanding of user behavior and data. Develop highly scalable and extensible big data platforms that enable the collection, storage, modeling, and analysis of massive data sets.
Qualifications/requirements:
- Over 8 years of engineering and/or software development experience.
- Hands-on experience with Apache big data components and frameworks; deep technical expertise in Spark, Hive, Impala, and Kudu.
- Over 3 years' experience with Python (PySpark) and Scala.
- Strong DevOps skills and the ability to guide the client in the physical deployment of clusters; should be expert in Maven, GitHub, and Jenkins.
- Strong data modeling; the candidate must have end-to-end experience on at least two big data / data warehouse projects.
- Expertise in real-time data streaming using Kafka and Spark Streaming; should be able to deploy and monitor Kafka clusters (a minimal structured-streaming sketch follows this posting).
- Strong expertise in developing data-as-a-service platforms; should have significant experience consuming data from third-party data APIs.
- 3 years of experience working with, and developing datasets for, Tableau developers and data scientists.
- Experience in the architecture and implementation of large and highly complex projects.
- Deep understanding of cloud computing infrastructure and platforms.
- History of working successfully with cross-functional engineering teams.
- Demonstrated ability to communicate highly technical concepts in business terms and to articulate the business value of adopting big data technologies.
- Bachelor's degree.

Preferred requirements:
- Experience in business domains like B2B manufacturing, communications, finance, and supply chain.
- Experience with big data platform and application development.
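A minimal sketch of Kafka feeding Spark Structured Streaming in PySpark; the broker, topic, and paths are invented, and the job assumes the spark-sql-kafka connector is on the classpath:

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col

    spark = (SparkSession.builder
             .appName("kafka-stream-demo")
             .getOrCreate())

    # Subscribe to a hypothetical topic; Kafka delivers key/value as bytes.
    events = (spark.readStream
              .format("kafka")
              .option("kafka.bootstrap.servers", "localhost:9092")
              .option("subscribe", "events")
              .load()
              .select(col("key").cast("string"), col("value").cast("string")))

    # Land the raw stream; a real job would parse JSON and aggregate.
    query = (events.writeStream
             .format("parquet")
             .option("path", "/tmp/events")
             .option("checkpointLocation", "/tmp/events-ckpt")
             .start())
    query.awaitTermination()

The checkpoint location is what gives the stream exactly-once file output across restarts, which is why monitoring it is part of operating such a cluster.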
Ciber Global is seeking a data scientist/architect in Nashville, TN!

Job description: Reviews ongoing activities related to the development, implementation, and maintenance of, and adherence to, the state's policies and procedures covering the privacy of, and access to, state information, in compliance with federal and state laws (i.e., HIPAA, FTI, PCI, SSA, FERPA, CJIS, FISMA). Reviews all system-related information security plans throughout the network to ensure alignment between security and privacy practices. Ensures that data security practices, in particular logging, monitoring, and auditing practices, are not in conflict with privacy requirements. Develops corrective action plans for identified privacy compliance issues. Develops an audit and compliance program to assure adherence to established standards. Conducts privacy risk and impact assessments. Monitors the status and effectiveness of privacy controls across service offerings, ensuring that privacy-related key risk indicators are effectively monitored to prevent an unacceptable impact on business objectives and reputation. Liaises with data architects, database administrators, and third parties to ensure that sensitive data is stored and monitored appropriately, and to anticipate potential privacy problems embedded in the use of emerging technologies (e.g., cloud, artificial intelligence, machine learning). Responds to regulatory authorities in the event of any data breach. Performs regular data discovery exercises to ensure all sensitive data is identified and monitored. Performs initial and periodic information privacy risk assessments and conducts related ongoing compliance monitoring activities. Facilitates root cause analysis and correction of operational processes. Assists, directs, delivers, or ensures delivery of initial and ongoing privacy training and orientation to all employees and professional staff in regard to data privacy.

Skills/experience: Graduation from an accredited college or university with a bachelor's degree, and five years of experience in information technology, information privacy laws, access/release of information, and release control technologies. Extensive data analysis and data privacy experience. Experience in SQL, Oracle, and COBOL environments. Experience compiling, organizing, and strategizing enterprise-wide data plans, with a strong emphasis on reporting and normalization across many different platforms from an architectural perspective. Sciences and informatics background. This position requires the successful completion of a background investigation and/or drug screen. Ciber Global is an equal opportunity employer: minorities, females, gender identity, sexual orientation, protected veterans, individuals with disabilities. Ciber Global is an IT consulting company that partners with organizations to develop technology strategies and solutions that deliver tangible business value. Founded in 1974, Ciber is an HTC Global Services company. For more information, visit www.ciber.com.

Title: Data Engineer – Hadoop/Teradata. Location: Sunnyvale, California. Duration: 6 months. We are seeking a skilled data engineer to join our analytics team. The ideal candidate has an eye for building and optimizing data systems, and will work closely with our systems architects, data scientists, and analysts to help direct the flow of data within the pipeline and ensure consistency of data delivery and utilization across multiple projects.

Responsibilities:
• Work closely with other data and analytics team members to optimize the company's data systems and pipeline architecture.
• Design and build the infrastructure for data extraction, preparation, and loading of data from a variety of sources, using technology such as SQL and AWS.
• Build data and analytics tools that will offer deeper insight into the pipeline, allowing for critical discoveries surrounding key performance indicators and customer activity.
• Always angle for greater efficiency across all of our company data systems.

At PointSource we dream up and build digital journeys that matter to millions of users. We create a deeper relationship with users by delivering memorable experiences that are personalized. We do that by leveraging engineering, design, and innovation with our own industry-leading practices. We want you to join us in creating these journeys for the biggest clients in tech, retail, travel, banking, ecommerce, and media, revolutionizing and growing their core businesses while helping them (and yourself!) stay relevant. We are currently looking for a big data architect with the skills below!
Strategic leadership: with constantly developing big data technologies, a big data architect (BDA) needs to apply strategic skills to set the course for organizational strategy development and required modifications. Results-driven mindset: the BDA must be able to adjust easily to an ever-changing environment to ensure the strategy stays on course and meets its profitability targets. Organizational relationship building: a BDA must be able to fully master navigating internal politics within the organization.

Requirements:
- 10+ years of professional experience.
- Bachelor's in computer science (or a related major) or equivalent experience.
- Passion for technology; we work on cutting-edge digital technologies.
- Travel as necessary, up to 25%, for engagements.

Preferred:
- 3+ years of experience as a DBA in Oracle or PostgreSQL (especially if working on a Greenplum platform), or 5+ years of experience as a Python or Java framework developer.

Business analysis:
- Knowledge of the business model and the client/consumer cycle; identify enhancement opportunities in the production cycle; analyze opportunities to associate public data sources; give visibility into resource usage in a segmented way by region; help develop new business opportunities by analyzing brand presence in the digital market; help measure web content audience reach.
- Gap analysis: perform deep analysis of business needs and technical context as input to a big data roadmap strategy for deploying a single data reservoir in the organization; perform architectural and functional assessment of the existing context to suggest the technology stack that best fits the detected functional and non-functional requirements of the big data implementation plan.

Data reservoir:
- Hands-on experience in data modeling in the big data field, dealing with large volumes of data.
- Hands-on experience with Hortonworks, Cloudera, MapR, or Apache Hadoop distributions.
- Experience with NoSQL databases such as Apache HBase, MongoDB, or Cassandra.
- Ability to define relational as well as NoSQL schemas.

Data ingestion, cleansing, validation, and catalog (a small raw-to-refined sketch follows this posting):
- Validates the availability of raw data; evaluates feasibility of transformation to the destination data source; defines the required ETLs; makes refined data available in structured/unstructured repositories.
- Hands-on experience using Flume or Sqoop for data ingestion.
- Ability to develop ETL using different tools, such as Apache Pig, Apache Hive, MapReduce, or Pentaho DI.
- Hands-on experience with distributed in-memory processing frameworks such as Storm, Kafka, or Shark/Spark/Impala for near-real-time processing and querying.
- Hands-on experience with job schedulers such as Oozie or Azkaban.
- Experience documenting APIs that expose the aggregated data, metadata, and data catalog.

Data mining:
- Work together with data scientists to understand the data model, and document how to access data sources most efficiently in order to run discovery phases on the platform, detect patterns, and generate predictive analytics.
- Suggest enhancements regarding data refinement and data completion.
- Design data replication for a disaster recovery strategy.

Deployment and disaster recovery:
- Implement and manage the big data solution on top of the sized hardware; cluster configuration and tuning; experience on AWS or Google Cloud.

Benefits: medical, FSA, dental, vision, short- and long-term disability; 401(k) with match; bring-your-own-device program (we pay for the device you choose to work with); vacation bonus; wireless reimbursement; quarterly bonus; vacation, holiday break, and unlimited sick time; a volunteer day to donate your time to a charity you choose; on-site yoga; beer o'clock; flexible work hours; training and professional development; healthy snacks and local-roastery coffee.
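A small raw-to-refined sketch of the data reservoir flow described above, in PySpark; the paths and column names are invented for illustration:

    from pyspark.sql import SparkSession
    from pyspark.sql.functions import to_date, col

    spark = SparkSession.builder.appName("reservoir-refine").getOrCreate()

    # Raw zone: JSON landings as delivered by ingestion (Flume/Sqoop/etc.).
    raw = spark.read.json("/data/raw/orders/")

    refined = (raw
               .dropDuplicates(["order_id"])             # basic cleansing
               .withColumn("order_dt", to_date(col("order_ts"))))

    # Refined zone: columnar, partitioned, query-friendly.
    (refined.write
     .mode("overwrite")
     .partitionBy("order_dt")
     .parquet("/data/refined/orders/"))

Keeping an untouched raw zone and a separately written refined zone is what lets validation and catalog steps be re-run without re-ingesting from the sources.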
Department description: The information technology department is responsible for establishing, monitoring, and maintaining information technology systems and services, as well as ensuring that all IT initiatives support HSF's goal of being the leading and most reputable non-profit organization in the US.

Role description: The data architect will report to our chief technology officer, and will be responsible for the cleaning, analysis, charting, reporting, and management of our enterprise data. This position combines hands-on execution of analytical tasks, architectural support for projects that alter the data landscape, and high-level strategic decision-making around the evolution of our data architecture.

Job requirements, general duties (essential operations):
o In partnership with users, define a future-state data architecture to meet current and future organizational needs around the strategic and tactical uses of data: logical and conceptual data models and data flowcharts; master data definitions and ownership; data cleansing practices and tools; and data replication and synchronization necessities, tools, schedules, and job procedures. Train, empower, and enable users to self-serve for most data access, analysis, and reporting needs.
o Work with the leadership team to define a roadmap, and drive the organization toward the future-state data architecture using program management discipline: manage the offshore development team, maintaining compliance with architectural standards; manage relationships with users and meet evolving data access needs; manage databases and interfaces for optimal performance; and execute projects that move the organization along the roadmap.
o Analyze structural data requirements for new software and applications.
o Define and manage security and backup procedures.
o Coordinate with other departments to identify future needs and requirements.
o Develop complex SQL queries to extract and transform unstructured data.
o Improve and refine the data editing and standardization process.
o Assist users in the visualization of data in order to explain, persuade, and tell a story.
o Build models leveraging basic data science techniques to help automate business processes.

Additional responsibilities:
o Migrate data from legacy systems to new solutions.
o Ongoing report generation, analysis, and review of critical key performance indicators.
o Gather, transform, explore, and analyze data across different sources.
o Partner with various team members to produce and deliver quality data outputs.
o Assist with additional projects and assignments as requested.

Qualifications, education/experience:
o BA/BS and/or MBA from an accredited university, with emphasis in computer science, mathematics, statistics, business analysis, management information systems, computer information systems, or other relevant discipline(s).
o Proven statistical analysis skills.
o Strong and extensive proficiency in Excel.
o Experience building new data warehousing and ETL processes.
o Extensive experience writing SQL queries, stored procedures, and scripts.
o Experience transforming the data architecture of an organization.
o Working knowledge of data science fundamentals: linear regression, statistical analysis, predictive analytics (a toy regression sketch follows this list).
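A toy illustration of the "data science fundamentals" named above, fitting a linear trend with NumPy on invented monthly totals:

    import numpy as np

    months = np.arange(1, 13)  # Jan..Dec
    # Invented data: a rising trend plus noise.
    donations = 50_000 + 1_200 * months \
        + np.random.default_rng(0).normal(0, 800, 12)

    # Ordinary least-squares fit of a degree-1 polynomial (linear regression).
    slope, intercept = np.polyfit(months, donations, deg=1)
    print(f"trend: {slope:,.0f} per month")
    print(f"forecast for month 13: {slope * 13 + intercept:,.0f}")

This is exactly the predictive-analytics pattern at its smallest: fit on history, extrapolate one period ahead, and sanity-check the slope against what the business expects.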
Fit with professional environment:
o Ability to prioritize, handle multiple tasks and projects, juggle changing deadlines, provide structure to the team, and meet tight deadlines.
o Capability to frame unstructured, complex analytical problems.
o Ability to solve multiple problems at the same time.
o Ability to work in a start-up type of environment where information is not always structured and/or routinely available.
o Ability to work in a small team environment and with all levels of staff and management.
o Ability to interact professionally with a diverse group of fellow team members, executives, managers, and subject matter experts.

Professional skills:
o Excellent analytical skills.
o Effective and excellent communication: written, verbal, and interpersonal, including presentation skills.
o Outstanding organization skills, strong listening skills, and extreme attention to detail.
o Able to maintain confidentiality of work-related information and materials.
o Must be enthusiastic and self-motivated, and possess the ability to execute with minimal direction.
o Meticulous attention to detail, with an overall passion for continuous improvement.
o Innovative and creative, with a logical and methodical approach to problem solving.

Additional requirements:
o Experience as a cross-functional team member.
o Flexible to work weekends and/or extended work days as required.

Other criteria: The physical demands described here are representative of those that must be met by an employee to successfully perform the essential functions of this job. This position requires constant sitting or standing, some walking around, and occasionally lifting no more than 10 lbs. To apply: email resume and cover letter.

The data architect will provide guidance and leadership to the organization as the technology and business landscape evolves to meet the demands of a changing marketplace. This person must have knowledge of all components of the warehousing architecture: technical infrastructure, data, ETL, BI, and metadata management. The data architect will ensure that appropriate standards are defined and followed, and that the individual components of the architecture are compatible with broader architectural goals.

- Partner with the business and technology functions to develop a clear understanding of business development and operations.
- Establish and maintain the data architecture and roadmap for the organization.
- Set and enforce standards and overall architecture for the enterprise information access architecture.
- Monitor changes occurring in the data technology market, and assist in establishing the organization's warehousing strategy and the selection of strategic warehousing tools and techniques.
- Lead efforts to build an integrated data model for the analytical data environment.
- Educate the project teams on the standards and architecture of each component of the data architecture.
- Ensure compatibility of the different components of the data architecture.
- Ensure proper selection of appropriate software tools and development techniques for the different components of the data architecture.
- Mentor staff and provide educational and training opportunities.
- Assist in resolving project issues, and in managing all requests for changes in scope or requirements.
- Monitor changes occurring in the metadata market, and assist in establishing the organization's metadata strategy and the selection of strategic metadata and modeling tools and techniques.
- Design conceptual and logical data models and flowcharts.
- Provide oversight of data mapping, data design, and data modeling activities, and participate in design, code, and test reviews to ensure quality solutions.
- Enforce and enhance data analysis and modeling methods and procedures, data standards and conventions, and metadata management procedures.
Minimum education requirements: Bachelor's degree in computer science, mathematics, information systems, or a related degree preferred, or equivalent work experience.

Special knowledge and/or skills:
- Healthcare system business and management experience.
- Excellent written and verbal communication skills.
- Strong organizational skills and detail oriented.
- Ability to work well with others across all organizational hierarchies.
- Knowledge of business systems, operating systems, data modeling, data management, data quality management, and data warehousing.
- Knowledge of data acquisition (ETL) tools, data cleansing tools, BI tools, master data management tools, data modeling tools, and data visualization tools (Tableau).
- In-depth knowledge of MS SQL Server components: SQL Server, SSIS, SSRS, DQS.

Work background/experience:
- A minimum of 8 years of data engineering experience.
- A minimum of 5 years of data architecture and data modeling experience.
- Experience working in large-data-volume environments.
- Experience working in a regulated industry.
- Experience working as a data scientist or data analyst is preferred.

Physical requirements: Physical health sufficient to meet the ergonomic standards and demands of the position.

About us: Virginia Premier is a managed care organization which began as a full-service Medicaid MCO in 1995, partnered with VCU Medical Systems. We strive to meet the needs of the underserved and vulnerable populations in Virginia by delivering quality-driven, culturally sensitive, and financially viable Medicare and Medicaid healthcare programs. Headquartered in Richmond, VA, we also have offices in Roanoke, Tidewater, and Bristol, with additional satellite locations, allowing us to serve over 200,000 members across eighty counties throughout Virginia. We offer competitive salaries and a comprehensive benefits package, including excellent medical, dental, and vision plans, tuition assistance, an infant-at-work program, remote work options, and generous vacation and sick leave policies. Our culture supports an environment where employees can continuously learn and gain professional growth through various development programs, education, exciting projects, and career mobility. All qualified applicants will receive consideration for employment without regard to age, race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or protected veteran status. EOE. Our mission is to inspire healthy living within the communities we serve!
Senior Data Engineer – Python/SQL. Location: Lehi, Utah.

Summary: Owlet is searching for a senior data engineer to join our platform development team. The ideal candidate has several years of experience with Python, SQL, and ETL, and will be responsible for maintaining, expanding, and optimizing our data pipeline, as well as optimizing data flow and collection for our cross-functional teams. The hire will report directly to the VP of software engineering. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys the challenges of optimizing data systems, as well as the thrill of building them from the ground up. The senior data engineer will support our software developers, business analysts, and data scientists with our baby health and sleep data initiatives, and will ensure optimal data delivery throughout ongoing projects. This is an awesome opportunity for someone to leverage their data skills to help create innovative products that give parents peace of mind by delivering the right infant health data at the right time.

Primary responsibilities:
- Create and maintain optimal data pipeline architecture.
- Assemble, analyze, and transform large, complex data sets that meet business requirements.
- Identify, design, and implement internal process improvements: automating existing manual processes, and optimizing data delivery, cost, and scalability.
- Build the ETL infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources, using SQL and big data technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition and experience, operational efficiency, and other key business performance metrics as needed.
- Work with stakeholders, including the executive, product, data, and software teams, to assist with data-related technical issues and support their data infrastructure needs.
- Be meticulous, detail oriented, and absolutely dependable.
- Work closely with project managers, the VP of software engineering, and fellow engineers, following established quality processes.
- Participate in regular code reviews.

Required skills and qualifications:
- Advanced working Python and SQL knowledge, experience working with relational databases and query authoring (SQL), and working familiarity with a variety of non-relational databases.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Build processes supporting data transformation, data structures, metadata, dependency, and workload management.
- A successful history of manipulating, processing, and extracting value from large, disconnected datasets.
- B.S. degree in computer science, statistics, informatics, or information systems.
- 5+ years of experience in a data engineer role.
- Experience with NoSQL and big data.
- Proficient in Java, Python, Node.js.
- A firm understanding of OOP.
- Experience with debugging, and meticulous about quality.
- Must be comfortable working independently as well as in a team.
- Experience working with applications developed by another developer.
- Experience with source control, preferably Git.
- Excellent written and verbal communication skills.

Bonus skills and qualifications:
- Experience with GCP (especially Dataflow, Bigtable, and BigQuery), AWS, Azure, Firebase.
- Experience with data interchange formats (i.e., XML, JSON).
Candidates will be asked to demonstrate analytical, problem-solving, and programming skills and competencies with the required skills and qualifications, and to present applications and websites that showcase their abilities.

Perks: flex-time scheduling, competitive compensation based on experience, benefits package, product discounts, stock options, lunch every Friday, and building a product that saves lives.

America's Test Kitchen is seeking an ETL data manager to join our team. Reporting to the senior director of technology and analytics, and working primarily in support of America's Test Kitchen customer relationship management (CRM) and marketing initiatives, the ETL manager will assist the strategic analytics team in managing various data sources and data processes. The primary responsibility is to help maintain, troubleshoot, and modify existing ETL processes and test newly developed processes using SQL, Java, and Talend Open Studio. They will also assist in bringing current processes up to standards by creating testing, monitoring, analyzing, and optimizing procedures.

Responsibilities:
- Design extract, transform, and load (ETL) components and process flow.
- Work independently in the development, testing, implementation, and maintenance of systems of moderate-to-large size and complexity, including audits and troubleshooting.
- Monitor ETL processes and infrastructure.
- Provide detailed analysis of problems and, working with internal IT, the marketing database agency, and third-party vendors, recommend solutions and remedies.
- Collaborate across multiple teams to recommend and develop system solutions to business problems, and translate business requirements into technical specifications and applications.
- Make clear and concise presentations to establish consensus on architectural and design decisions among the business intelligence, data science, and marketing areas.
- Evaluate feature, upgrade, and change requests, recommend action to internal clients and/or the strategic analytics team, and create clear, detailed requests to the marketing database agency and/or internal teams.
- Document database designs, solutions, and configurations, and teach and mentor other team members and vendors, including onboarding and knowledge transfer.
- Continuously improve system and database performance, automation, operations, and processes.
- Ad hoc SQL queries and data exports.
- Other duties as assigned.

You:
- Are able to work in a collaborative environment focused on team results.
- Are experienced and skilled at communicating complex concepts in a straightforward manner to a non-technical audience.
- Show strong initiative, with the ability to identify areas of improvement with little direction.
- Have strong interpersonal, communication, and presentation skills.
- Continue to learn, evaluate, and recommend new techniques, software, or systems.
- Are self-motivated, with demonstrated thoroughness, follow-up, and attention to detail.
- Are not easily discouraged by legacy data structures, and keep a positive attitude toward business problems.
- Set priorities and make adjustments to reach goals across multiple projects.
- Have an eagerness to learn new skills as projects require.

Skills and experience:
- Minimum 3 years' previous data management experience.
- 3+ years of focused experience developing data warehouses and ETL.
- Strong understanding of data warehousing methodology and ETL/ELT best practices.
- Hands-on experience architecting and developing ETL in a Talend Open Studio environment.
- Understanding of ETL frameworks: developing audit, balance, control, and validation architecture, etc.
- Familiarity with DBMS table design, loading, and tuning principles, and experience with SQL and stored procedures.
- Experience with AWS infrastructure and resource allocation.
- Experience with the Java programming language.
- Ability to, and comfort in, monitoring and reviewing Windows performance and event logs to troubleshoot, diagnose, and remedy server issues.
- Familiarity with performance and optimization of virtual environments and provisioning.
- Snowflake experience a plus.
- Knowledge of code version management using GitHub a plus.
- PowerShell scripting and/or Perl programming a plus.

The incumbent is responsible for performing tasks related to the expansion and optimization of data architecture, to empower business units with ready access to the data required for business planning and decision-making. The following represents the majority of the duties performed by the position, but is not meant to be all-inclusive nor to prevent other duties from being assigned when necessary:
- Supports and works with software developers, data architects, data analysts, and data scientists on transformational initiatives, ensuring optimal data delivery architecture is consistent throughout ongoing projects.
- Works with internal data and analytics experts on process improvements, in order to create greater functionality and efficiency in data systems.
- Develops, tests, maintains, and troubleshoots code, including database development and ETL/data migration development.
- Analyzes operational data requirements and contributes to information technology and capacity requirements assessments.
- Collaborates with technical and non-technical partners to solve problems and develop new functionality.
- Participates in requirements gathering and analysis meetings with team members, stakeholders, and internal customers.
- Supports and develops enhancements to existing data warehouse and Spotfire processes.
- Assesses the validity and business sense of data warehousing models for end-user consumption.
- Assists power users across business units with the creation of dashboards, data visualizations, and reports in Excel or Spotfire.
- Other tasks as assigned.

Requirements: A four-year degree in management information systems, computer information systems, computer science, or a related field is highly preferred; a combination of education and experience may be considered in lieu of a degree. Demonstrated understanding of relational databases (SQL) is required. Knowledge of SQL Server Reporting Services (SSRS) and SQL Server Analysis Services (SSAS) cubes preferred. Experience in SQL Server Integration Services (SSIS) and other advanced business intelligence concepts related to data extraction, transformation, loading, scheduling, etc. strongly desired. Familiarity with the Kimball approach to data mart development and multi-dimensional database structures, e.g., star schemas, preferred (a toy star-schema sketch follows this posting). Preferred experience with analytics and data visualization tools such as Spotfire, Tableau, or Power BI. Some experience with R, Python, and/or predictive analytics a plus. Established proficiency with the Microsoft Office suite (Excel, Word, PowerPoint). Strong interpersonal and problem-solving skills, with excellent verbal and written communication skills, are required. Must be detail oriented, with the ability to multi-task with attention to accuracy. Must be a team player who possesses the ability to adapt to a changing and fast-paced work environment.

Cimarex Energy Co. is an equal opportunity employer. Applicants and employees are considered for positions and are evaluated without regard to mental or physical disability, race, color, religion, gender, national origin, age, genetic information, military or veteran status, sexual orientation, marital status, or any other protected federal, state/province, or local status unrelated to the performance of the work involved.
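A toy Kimball-style star schema, built in sqlite3 so the example is self-contained; the table and column names are invented for illustration (production-well facts chosen only to fit the energy setting):

    import sqlite3

    db = sqlite3.connect(":memory:")
    db.executescript("""
        CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, cal_date TEXT, month TEXT);
        CREATE TABLE dim_well (well_key INTEGER PRIMARY KEY, well_name TEXT, basin TEXT);
        CREATE TABLE fact_production (
            date_key INTEGER REFERENCES dim_date,
            well_key INTEGER REFERENCES dim_well,
            oil_bbl REAL
        );
    """)
    db.executemany("INSERT INTO dim_date VALUES (?,?,?)",
                   [(20240101, "2024-01-01", "2024-01"),
                    (20240102, "2024-01-02", "2024-01")])
    db.executemany("INSERT INTO dim_well VALUES (?,?,?)", [(1, "Well A", "Permian")])
    db.executemany("INSERT INTO fact_production VALUES (?,?,?)",
                   [(20240101, 1, 510.0), (20240102, 1, 495.5)])

    # The classic star join: narrow fact table grouped by dimension attributes.
    for row in db.execute("""
        SELECT d.month, w.basin, SUM(f.oil_bbl)
        FROM fact_production f
        JOIN dim_date d USING (date_key)
        JOIN dim_well w USING (well_key)
        GROUP BY d.month, w.basin"""):
        print(row)

The fact table holds only keys and measures; descriptive attributes live in the dimensions, which is what makes the schema friendly to SSAS cubes and ad hoc slicing.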
We are currently seeking a motivated, career- and customer-oriented database architect/data scientist to join our team in Springfield, VA, and begin an exciting and challenging career with Unisys Federal Systems.

Key responsibilities:
- Develop an understanding of the customer's data environment through data profiling and statistical analyses.
- Execute complex SQL queries against large Oracle tables efficiently (note: advanced command of SQL is important, beyond simple PROC SQL commands in SAS, including perhaps tools like Toad or Oracle SQL Developer).
- Obtain, scrub, explore, model, and interpret data currently stored in Oracle databases, using SQL and other data mining tools and techniques.
- Assess accuracy and biometric sample quality based on machine learning and statistical analyses.

Required experience and skills:
- Master's degree and a minimum of 15 years of experience, or equivalent.
- Experience developing predictive models of accuracy using large data sets in a high-transaction-volume environment.
- Experience evaluating and measuring the performance of models.
- A firm understanding of common statistical modeling techniques (e.g., linear regression, logistic regression, decision trees, etc.), and the ability to perform the above tasks using Python or R.
- The ability to perform with little direct supervision as a self-starter, with excellent troubleshooting skills.
- A self-motivated, creative, and inquisitive problem solver with a strong work ethic.
- Be (or rapidly become) a thought leader in analytics/data science with respect to entity resolution as it pertains to the customer's mission.
- Able to generate written documentation of all work performed.
- A customer-focused demeanor and effective oral and written communication skills.
- Ability to understand and analyze data models: how the data is stored in relational databases.
- Ability to understand the system integration aspects of integrating model input and output into transactional systems to support real-time decision making.
- Good understanding of software application architecture, and the ability to develop integration approaches for predictive models.
- Prior experience working in a mission-oriented environment a plus; experience with statistics, modeling, and machine learning techniques a plus.
- Ability to work effectively within a team, and in a high-energy, very rapid, dynamic environment.

Preferred experience and skills (a small error-rate sketch follows this posting):
- Conceptual understanding of, and/or prior experience related to, calculating FAR, FMR, FRR, FNMR, TAR, TNR, and false alarm rate.
- Conceptual understanding of, and/or prior experience related to, data profiling, fuzzy matching, entity resolution, and signal detection theory (specifically, with respect to SD theory, designing and improving systems that monitor, minimize, and balance false positive and false negative outcomes).
- Conceptual understanding of, and/or prior experience related to, biometric performance using ROC, DET, and CMC curves, identification and detection rate curves, and so forth.
- Conceptual understanding and knowledge of ISO/IEC 19795-1:2006 (or ISO/IEC NP 19795-1).
- Advanced (master's) degree in computer science, applied mathematics, or computational statistics preferred.
- Agile.

Customer requirements:
- Clearance: must pass a CBP background investigation, and must have a DoD Secret clearance or be eligible to obtain one.
- Citizenship: must be a US citizen.
- Location: Kingstowne, VA (Alexandria).
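A small illustration (not Unisys code) of two of the biometric error rates named above, false match rate (FMR) and false non-match rate (FNMR), computed at a score threshold from invented genuine/impostor comparison scores:

    import numpy as np

    genuine  = np.array([0.91, 0.85, 0.78, 0.95, 0.62, 0.88])  # same-subject scores
    impostor = np.array([0.12, 0.35, 0.41, 0.08, 0.55, 0.27])  # different-subject scores

    def rates(threshold):
        fnmr = np.mean(genuine < threshold)    # genuines wrongly rejected
        fmr = np.mean(impostor >= threshold)   # impostors wrongly accepted
        return fmr, fnmr

    # Sweeping the threshold traces out the ROC/DET trade-off between the two.
    for t in (0.3, 0.5, 0.7):
        fmr, fnmr = rates(t)
        print(f"threshold={t:.1f}  FMR={fmr:.2f}  FNMR={fnmr:.2f}")

This is the signal-detection balancing act the posting describes: raising the threshold drives FMR down and FNMR up, and the operating point is chosen by the relative cost of each error.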
Do you have what it takes to be mission critical? Your skills and experience could be mission critical for our Unisys team supporting the federal government in its mission to protect and defend our nation and transform the way government agencies manage information and improve responsiveness to their customers. As a member of our diverse team, you'll gain valuable career-enhancing experience as we support the design, development, testing, implementation, training, and maintenance of our federal government's critical systems. Apply today to become mission critical and help our nation meet the growing need for IT security, improved infrastructure, big data, and advanced analytics.

Unisys is a global information technology company that solves complex IT challenges at the intersection of modern and mission critical. We work with many of the world's largest companies and government organizations to secure and keep their mission-critical operations running at peak performance; streamline and transform their data centers; enhance support to their end users and constituents; and modernize their enterprise applications. We do this while protecting and building on their legacy IT investments. Our offerings include outsourcing and managed services, systems integration and consulting services, high-end server technology, cybersecurity and cloud management software, and maintenance and support services. Unisys has more than 23,000 employees serving clients around the world. Unisys offers a very competitive benefits package, including health insurance coverage from the first day of employment, a 401k with an immediately vested company match, vacation, and educational benefits. To learn more about Unisys, visit us at www.unisys.com. Unisys is an equal opportunity employer (EOE): minorities, females, disabled persons, and veterans. #FED#

Contract-to-hire opportunity for an MDM developer for the State of TN in Nashville, TN! There are two roles on the team, one data scientist and one MDM developer, that will work closely together, enterprise-wide with all departments at SOTN, to create strategy, architecture, and planning for all database activities. These folks will want to have experience in multiple environments (SQL, Oracle, COBOL, etc.) to have success, as all are a part of the environment at SOTN.

Knowledge Services, established in 1994 and headquartered in Indianapolis, IN, is a certified woman-owned (WBE) professional services organization with over 1,500 employees located in offices throughout North America. Founded by Julie Bielawski, CEO, GuideSoft, Inc., DBA Knowledge Services, is an industry leader in managed service programs (MSP), employer-of-record and payrolling services, and national recruitment and staffing services. We provide outstanding services to major organizations in various industries, including IT, healthcare, entertainment, media, federal and state governments, public utilities, telecom, manufacturing, and more. As such, Knowledge Services is committed to providing opportunities for growth: in our company, in each team member, and in our relationships. We believe titles do not define a person but provide a framework to each person's endless potential. Our focus on improving our team, product, and processes drives us every day. We are guided by our four pillars that set the foundation of who we are and how we conduct business: knowledge, integrity, innovation, and service. Knowledge Services has benefit offerings to include the following!
- Medical, dental, and vision coverage
- Voluntary life and AD&D coverage
- Pet insurance
- Ticket and event discounts!
The above are available provided contractors meet eligibility requirements.

As MDM developer for the Center of Excellence for Data at STS, you will provide expert advice, counsel, and technical expertise to ensure successful deployment of a best-in-class master data management (MDM) solution. Duties include:
- Defines the overall MDM solution architecture and sets technical direction; defines component architecture and reviews detailed designs for accuracy and overall compliance with the defined architecture
- Develops an understanding of STS's needs and translates those needs into technological systems, design specifications, and solutions
- Develops standards, best practices, and reference designs for the implementation of MDM solutions
- Partners with other departments to analyze, design, develop, and implement MDM solutions that will optimize business outcomes; works closely with BAs and architects to implement best-of-class MDM solutions
- Provides technical leadership for the development of data models, data mappings, data validation rules, and match-and-merge rules
- Designs and develops customized MDM services, and the necessary batch interfaces to and from the MDM hub
- Ensures the design meets the business service level agreements (SLAs) for availability and performance
- Conducts regression testing of new releases of software components within the MDM repository and new releases of MDM software
- Designs conceptual, logical, and physical data models to support the enterprise master data management system and the logical workflow

Qualifications:
- Demonstrated experience in deterministic and probabilistic matching methodologies (see the sketch after this posting)
- Demonstrated experience and ability to apply data quality techniques such as match/merge, profiling, scorecarding, data standardization, and parsing using enterprise data quality platforms
- Experience with troubleshooting and resolving MDM-related issues and providing proven solutions aligned with master data strategies and best practices
- Experience with citizen data management terminology and citizen-related terminologies, including master citizen and person index
- Good experience providing architecture oversight, as well as in-depth experience building and managing the enterprise application stack for a master data management (MDM) platform
- Provide technical leadership and support for the master data management (MDM) platform, including solution architecture, inbound/outbound data integration (ETL), data quality (DQ), and maintenance and tuning of match rules and expectations
- Advise and provide support to data stewards, other system owners, and technical personnel in master data management concepts and technical matters
- Collaborate with source systems, data stewards, and technical personnel for data governance and to resolve any data quality or technical issues related to data ingestion
- Ability to design, develop, and translate business requirements and processes into data matching and merging rules, survivorship criteria, and data stewardship workflows using master data management solutions

Technologies: SQL Server 2016 or higher; SQL Server MDS services; SQL Server Machine Learning (R); SSIS and ETL packages; Solr and ZooKeeper; JSON, XML, T-SQL.

Physical requirements: the job frequently requires sitting and handling objects with hands; it occasionally requires standing, walking, reaching, talking, hearing, and lifting up to 25 pounds. Vision requirements: ability to see information in print and/or electronically. We are an equal opportunity employer; we do not discriminate on the basis of race, religion, color, sex, age, national origin, or disability.
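As a rough illustration of the deterministic and probabilistic matching the posting describes, here is a minimal sketch; the record fields, weights, threshold, and the use of difflib as the fuzzy scorer are all illustrative choices, not any specific MDM product's API:

    from difflib import SequenceMatcher

    a = {"ssn": "123-45-6789", "name": "Jon Smith",  "city": "Nashville"}
    b = {"ssn": "123-45-6789", "name": "John Smith", "city": "Nashvile"}

    # Deterministic match: exact agreement on a trusted identifier.
    deterministic = a["ssn"] == b["ssn"]

    # Probabilistic match: weighted string similarity across noisy fields.
    def sim(x, y):
        return SequenceMatcher(None, x.lower(), y.lower()).ratio()

    score = 0.7 * sim(a["name"], b["name"]) + 0.3 * sim(a["city"], b["city"])
    probabilistic = score >= 0.85  # threshold would be tuned against labeled pairs

    if deterministic or probabilistic:
        # Survivorship: a simple rule keeps the longer (more complete) value.
        merged = {k: max((a[k], b[k]), key=len) for k in a}
        print("merged record:", merged, "score:", round(score, 3))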
Description:
- Define models and prove them with sufficient test data
- Work creatively and analytically in a problem-solving environment
- Benchmark systems, analyze system bottlenecks, and propose solutions to eliminate them
- Clearly articulate the pros and cons of various techniques and machine learning models
- Document use cases, solutions, and recommendations
- Possess excellent written and verbal communication skills
- Perform detailed analysis of business problems and provide technical solutions
- Work in a fast-paced, agile development environment

Requirements:
- 4-year bachelor's degree or master's in computer science, preferably in data science
- Strong experience working with Spark and Scala (see the sketch after this posting)
- Strong experience working with Hortonworks HDP, Cloudera, or an equivalent Hadoop distribution
- Proven track record of researching new ways to model financial market trends and to provide business insights
- Knowledge of machine learning and statistical modeling with RStudio
- Strong background in financial systems and business processes
- Business analytics experience focused on corporate finance is a huge plus
- Background knowledge of Unix programming and Oracle SQL is a huge plus
- Experienced in designing solutions and implementing IT projects
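Since the requirements lead with Spark experience for modeling financial market trends, a minimal PySpark sketch of a moving-average trend over a price series may be useful; the column names, sample rows, and window length are invented for illustration:

    from pyspark.sql import SparkSession, functions as F
    from pyspark.sql.window import Window

    spark = SparkSession.builder.appName("trend-sketch").getOrCreate()
    df = spark.createDataFrame(
        [("ACME", "2024-01-01", 10.0), ("ACME", "2024-01-02", 10.5),
         ("ACME", "2024-01-03", 9.8),  ("ACME", "2024-01-04", 10.9)],
        ["symbol", "trade_date", "close"])

    # 3-row trailing window per symbol, ordered by date: a simple moving average.
    w = Window.partitionBy("symbol").orderBy("trade_date").rowsBetween(-2, 0)
    df.withColumn("ma3", F.avg("close").over(w)).show()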
KeyW Corporation is currently seeking a senior database engineer II with a current TS/SCI clearance in the Dulles, VA area. The database engineer provides technical expertise for database design, development, implementation, information storage and retrieval, data flow, and analysis; develops relational and/or object-oriented databases, database parser software, and database loading software; and projects long-range requirements for database administration and design. The engineer is responsible for developing a database structure that fits into the overall architecture of the system under development, and has to make trades among data volumes, number of users, logical and physical distribution, response times, retention rules, security, and domain controls. The database engineer works primarily at the front end of the lifecycle: requirements through system acceptance testing and initial operational capability (IOC). Develops requirements from a project's inception to its conclusion for a particular business and information technology (IT) subject matter area (i.e., simple to complex systems). Assists with recommendations for, and analysis and evaluation of, systems improvements, optimization, development, and/or maintenance efforts. Translates a set of requirements and data into a usable document by creating or recreating ad hoc queries, scripts, and macros; updates existing queries and creates new ones to manipulate data into a master file; and builds complex systems using queries, tables, open database connectivity, and database storage and retrieval using cloud methodologies. Leads development of databases, database parser software, and database loading software. Leads development of database structures that fit into the overall architecture of the system under development. Develops requirement recommendations from a project's inception to its conclusion for a particular business and IT subject matter area (i.e., simple to complex systems).

Required skills:
- Active Top Secret / Sensitive Compartmented Information (TS/SCI) security clearance required
- Four (4) or more years of database engineering experience required (a master's degree in a related discipline may substitute for two (2) years of experience)
- Bachelor's degree in computer science, mathematics, statistics, or a related field (six (6) years of experience, for a total of ten (10) or more years, may be substituted for a degree)
- Apache Hadoop, PostgreSQL, MySQL, VMware, or Oracle DBMS knowledge
- Knowledge of SQL Server and its tools, including the facets of successfully administering a wide range of simple to highly complex environments
- Experience with data and schema design and engineering
- Demonstrated practical experience with data migration from legacy systems to central repositories
- Industry-standard exchange schema implementation experience (e.g., CybOX or CAPEC)
- Ability to evaluate and install new software releases, patches, and system upgrades
- Knowledge and understanding of all aspects of database tuning: software configuration, memory usage, data access, data manipulation, SQL, and physical storage (a query-plan sketch follows this posting)
- Experience supporting a technology strategy roadmap
- Experience with development and execution of database security policies, procedures, and auditing; experience with database authentication methods, authorization methods, and data encryption techniques
- Good communication skills, both oral and written; must work well in a team environment as well as independently; good time management skills and independent decision-making capability; focus on customer service
- Ability to work with the other technical members of the team to administer and support the overall database and applications environment

Desired skills:
- Experience providing database engineering support to DHS, DoD, or intelligence customers
- Data scientist skills and experience
- Understanding of certification and accreditation (NIST 800-53) processes as they apply to database technologies
- Operating system and hardware platform knowledge
- Preferred experience working with large unstructured data sets; experience with MapReduce technologies; experience with process development and deployment
- Trained in Six Sigma methodology; ITIL knowledge and certification

Desired certifications: Cloudera Certified Professional (CCP): Data Scientist; CCDH: Cloudera Certified Developer for Apache Hadoop; CCAH: Cloudera Certified Administrator for Apache Hadoop; CCSHB: Cloudera Certified Specialist in Apache HBase; CSSLP: Certified Secure Software Lifecycle Professional; database-specific certifications as appropriate to the position; DoDI 8570.1 compliance at IAT Level I certification highly desired.

Clearance requirement: this position requires a Top Secret / SCI security clearance. KeyW is an EEO employer; we are committed to providing fair and equal employment consideration regardless of race, color, religion, national origin, gender, sexual orientation, age, marital status, or disability. How to apply? Please click Apply on the right. KeyW is a pure-play national security solutions provider for the intelligence, cyber, and counterterrorism communities' toughest challenges. We support the collection, processing, analysis, and dissemination of information across the full spectrum of their missions. We employ and challenge more than 2,000 of the most talented professionals in the industry with solving such complex problems as preventing cyber threats, transforming data into intelligence, and combating global terrorism. KeyW, together with its direct and indirect subsidiaries, encourages and actively supports a policy of equal employment opportunity and commits to provide equal opportunity to each individual, regardless of race, color, religion, gender, sexual orientation, age, national origin or ancestry, marital status, veteran status, disability, or any other classifications protected by federal, state, or local law. In fact, we foster an environment that promotes diversity, balance, and fun, because we believe in the importance of having a workplace as unique as the challenges we solve.
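Where the required skills mention database tuning and data access, one concrete habit is checking whether a query actually uses an index; a minimal SQLite sketch (the table and index names are invented, and the same idea applies via each engine's own plan tooling):

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE events (user_id INTEGER, ts TEXT, payload TEXT)")

    # Without an index, this predicate forces a full table scan.
    print(conn.execute(
        "EXPLAIN QUERY PLAN SELECT * FROM events WHERE user_id = 42").fetchall())

    conn.execute("CREATE INDEX idx_events_user ON events(user_id)")

    # With the index, the plan switches to an index search.
    print(conn.execute(
        "EXPLAIN QUERY PLAN SELECT * FROM events WHERE user_id = 42").fetchall())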
TS/SCI. The senior solutions architect will be responsible for validating, productionalizing, and integrating proven statistical models into the existing software infrastructure. The individual will work on one or many product teams at a time. He/she must have very strong communication and analytical skills and an ability to work as part of an agile team (product owner, data scientists, developers, etc.) at the interface with traditional IT, where he/she interacts with data and end-product owners to negotiate APIs and build data pipelines (a scoring sketch follows this posting).

Responsibilities:
- Gather requirements, build roadmaps, and design, implement, and unit test software solutions
- Work with product owners to understand and negotiate implementation constraints such as system APIs
- Work with the data architecture team and data engineers in the analytics CoE to understand and improve the inbound data pipeline
- Build and rigorously test software infrastructure based on existing data pipeline methods proven out by the data science team and on systems requirements jointly defined with product owners
- Work closely with both backend developers and the IT data architecture group
- Support bug fixing and performance analysis along the data pipeline
- Help build Oracle stored procedures, functions, packages, and Unix shell scripts as required
- Serve as a strong advocate for a culture of process and data quality across development teams
- Follow an agile development methodology

Technical requirements:
- BS/MS degree in computer science, engineering, statistics, information technology, or a related subject
- 5-7 years of experience in systems integration and software engineering in Python, C#, C++, Java, Scala, PHP, etc.
- 3+ years' experience deploying and integrating stat/ML models
- Has used SQL and similar languages, plus Unix shell scripting
- Experienced with statistical modeling design patterns and with building highly scalable and secured analytical solutions
- Experienced integrating models built using big data tools like MapReduce, Spark, CouchDB, Hive, Pig, and Kafka
- Strong analytical and problem-solving skills
- Scrum master or agile experience preferred

CAI is an EOE. Computer Aid, Inc. (CAI) is a global IT services firm that is currently managing active engagements with over 100 Fortune 1000 companies and government agencies around the world. CAI offerings include balanced outsourcing solutions, legacy support, application development, application knowledge capture, service desk, desktop services, and managed staffing services. Our unique methodologies and tools enable us to provide our clients with real techniques for increasing productivity, profitability, and competitiveness. Headquartered in Allentown, PA, with offices and staff throughout the US, Canada, Europe, and the Asia-Pacific region, CAI offers a variety of delivery options, including on-site, off-site, and blended solutions. Our delivery model allows us to successfully leverage our global staff of over 3,500 technical and managerial professionals. CAI is an equal opportunity / affirmative action employer; minorities, women, veterans, and individuals with disabilities will receive consideration and are encouraged to apply.
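Where the role centers on productionalizing a proven statistical model behind an agreed API, a minimal sketch of the pattern: load a serialized model, score a batch, and hand the results downstream. The artifact path, feature columns, and output drop are invented for illustration, assuming a scikit-learn-style model object:

    import pickle
    import pandas as pd

    # Model artifact produced and validated by the data science team (assumed path).
    with open("churn_model.pkl", "rb") as f:
        model = pickle.load(f)

    # Batch of records arriving from the inbound data pipeline.
    batch = pd.DataFrame({"tenure_days": [30, 400], "orders_90d": [1, 12]})

    # Score and attach predictions; predict_proba is the scikit-learn convention.
    batch["churn_score"] = model.predict_proba(batch)[:, 1]

    # Hand off to the negotiated downstream interface (here, just a CSV drop).
    batch.to_csv("scored_batch.csv", index=False)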
About the company: PlaceIQ is a powerful location-based audience and insights platform that organizes a wide variety of consumer activity data around a precise location base map at massive scale. PlaceIQ uses its detailed understanding of location and consumer activity to reach a targeted audience and also to derive powerful insights about consumer behavior to inform market and business strategies for national brands. The company is headquartered in New York City and has offices in San Jose and Chicago.

Summary: having already assembled an exceptionally skilled, diverse, and passionate team of developers and data scientists, we are looking for world-class engineers who can (or are willing to learn how to) do it all: from web front ends to regression models, classification algorithms, and complex data visualizations to geospatial clustering, from a full-stack enterprise targeting platform to big data analytics. PlaceIQ software engineers live for huge challenges and know how to deliver in a fast-paced agile environment. Our unique culture breeds excellence and embraces creativity as we look to innovate and drive our business forward. If you have a passion for imagining and building technology solutions that will make an immediate impact in an untapped space, we want to talk to you!

The role: as a senior member of our engineering team, you will own the design, development, and integration of the next-generation location analytics platform, along with the technical chops necessary to build and deploy robust, high-performance, always-available services that will scale to our clients' needs. The ideal candidate will bring the "right" mix of creativity and development discipline to expand the standards of excellence that our engineers and data scientists have already set. We work in an agile environment focused on delivering high-quality software through quick iterations. We don't always stick to a particular programming language or adapt our requirements to fit within a particular technology stack; software development for us focuses on the simple yet elegant solutions to complex problems. We rely on the most basic fundamentals of computer science every day while drawing from the latest and greatest breakthroughs in mobile, web, and big data paradigms to build the absolute best-in-class platform for our clients. At PlaceIQ we empower our team members to extend beyond traditional functional boundaries so that we can build game-changing products. This means that our engineers will need to put on their product manager hats to deeply understand the mobile ecosystem with an eye towards developing a major disruption. Our engineers will also have to become social anthropologists and analysts to uncover the hidden truths and behavioral patterns in petabytes of geospatial data.

Minimum requirements:
- BA/BS in computer science, engineering, or a related technical field
- 3-5 years of industry experience with Java and Hadoop
- Full-lifecycle development experience building high-performance, low-latency client-server architectures
- Strong database, NoSQL, and networking skills
- Experience in Unix/Linux environments with Bash, Python, or Ruby scripting
- Solid foundation in computer science fundamentals, from data structures and algorithms to high-level design patterns

Preferred qualifications:
- Mastery of a web services stack (any language)
- Strong experience with big data sets and NoSQL databases
- Previous experience managing teams of developers
- Strong knowledge of performance testing and processing optimization techniques
- Strong experience developing complicated algorithms
- Experience with functional languages (Scala, Clojure, Haskell)
- Experience working in a kanban/agile group
ConsumerTrack is unique in the digital marketing and media industry: we combine marketing, digital content, and fintech. Our performance-based approach increases brand awareness and generates targeted audience engagement on our internal web properties and partner sites. Learn more about what we do.

We're looking for a data engineer with experience designing, developing, and supporting enterprise-level data warehousing and reporting platforms. You will play a vital role in building our data warehouse and developing real-time ETL processing to support our analytical reporting systems. You will also be supporting our data science and data mining projects for ongoing growth.

Responsibilities:
- Work closely with product managers, engineers, and business stakeholders to become a source data expert
- Develop and maintain ETL processes to ensure data quality and consistency
- Define technical requirements and data architecture for the underlying data warehouse
- Collaborate with subject matter experts across different business units to design, implement, and deliver insightful analytic solutions
- Improve and maintain data access for our BI tools
- Automate data quality monitoring and improve auditing capabilities (see the sketch after this posting)

Basic qualifications:
- 3+ years' experience in relational databases, dimensional data modeling, ETL development, and data warehousing
- Proficiency in Redshift, Snowflake, or other MPP databases
- Expert understanding of SQL and strong SQL performance tuning skills
- Experience with scalable architectures and large data processing
- Experience with BI systems (Tableau, Periscope, etc.)
- Excellent troubleshooting and problem-solving skills

Preferred qualifications:
- Experience with Pentaho
- Experience with Hadoop and/or other tools in the ecosystem
- Exposure to the AWS ecosystem

Benefits: competitive salary with excellent growth opportunity (we pride ourselves on having a team that exudes leadership, high initiative, creativity, and passion); awesome medical, dental, and vision plans with heavy employer contribution; paid vacation, holidays, and sick days; company funding for outside classes and conferences to help you improve your skills; contribution to student loan debt after the first year of employment; 401k (we match 3% of employee salary after the first year of employment); in-office gym and weekly fitness and yoga classes; fully stocked kitchen with snacks and beverages. We are an equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by law.
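A minimal sketch of the kind of automated data quality audit that responsibility describes: null-rate, duplicate, and freshness checks over a batch. The columns, thresholds, and run date are invented for illustration:

    import pandas as pd

    df = pd.DataFrame({
        "order_id": [1, 2, 2, 4],
        "amount": [9.99, None, 5.00, 12.50],
        "loaded_at": pd.to_datetime(
            ["2024-05-01", "2024-05-01", "2024-05-01", "2024-04-01"]),
    })

    issues = []
    # Completeness: flag columns whose null rate exceeds a tolerance.
    null_rates = df.isna().mean()
    issues += [f"null rate {r:.0%} in {c}" for c, r in null_rates.items() if r > 0.10]
    # Uniqueness: business keys should not repeat.
    if df["order_id"].duplicated().any():
        issues.append("duplicate order_id values")
    # Freshness: the newest row should be recent relative to the run date.
    age_days = (pd.Timestamp("2024-05-02") - df["loaded_at"].max()).days
    if age_days > 1:
        issues.append(f"stale data: newest row is {age_days} days old")

    print(issues or "all checks passed")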
Senior data engineer: contract role for 7 months. Data engineering team | Santa Clara, CA. Your goal: to improve the education process and better the lives of students, through data engineering.

Organization: data engineering, Chegg web services. The data engineering team is central to Chegg, supporting every business line and functional organization at the company. It is a team composed of data engineers and architects. The team influences strategy and tactics for Chegg's data ecosystem. Students' daily interactions with Chegg's physical and digital books, subscriptions, and learning aids generate terabytes of structured and unstructured data. Our business, data science, and analytics community use this data to understand how students are learning from our products and services. The data will drive business decisions and future investments. The data will transform education; Chegg is your sandbox to make this happen. As a senior data engineer you will build a data processing framework that handles terabytes of Chegg's product engagement and transactional events. You will work with a team of data engineers and build the data ecosystem at Chegg. You will work with data scientists, analysts, and the reporting team that leverage data with scientific and reporting tools such as R, Spark, and Tableau. You will engage with analysts and leaders to research and develop new data engineering capabilities.

Responsibilities:
- Design and build data frameworks (ingestion, processing, distribution, access) for the core data platform
- Build a scalable data events processing framework that handles terabytes of data
- Design and build robust and scalable data engineering solutions for structured and unstructured data, conducive to business insights, reporting, and analytics
- Design and implement data models that scale across the enterprise
- Work closely with stakeholders (reporting team, analysts, and business) and source system domain experts
- Function as the data expert on the usage of data at Chegg

Requirements:
- 7+ years working as an engineer in a data-focused team, building data processing frameworks and capabilities; we are not looking for a traditional data warehouse / ETL developer
- Extensive experience working with structured and unstructured data and with programming and scripting languages; the assumption is you have experience working with big data technologies (like Hadoop, Hive, Pig, Impala, Presto, Spark), real-time processing frameworks (like Storm, AWS Kinesis), MPP platforms (like Redshift, Greenplum, Vertica), and relational databases
- Expertise troubleshooting data quality issues, analyzing data requirements, and utilizing big data systems
- A strong desire to build data platforms that drive insights from Chegg's network of students, physical and digital content, and learning aids
- An insatiable appetite to transform education through data

About Chegg: as the leading student-first connected learning platform, Chegg's student hub makes higher education more affordable and more accessible, all while improving student outcomes. Chegg is a publicly held company based in Santa Clara, California, trades on the NYSE under the symbol CHGG, and has 900+ employees with offices in Santa Clara, San Francisco, Portland, New York, India, Berlin, Israel, and Ukraine. Chegg student hub services include Chegg Study, Chegg Tutoring, Careers Search, Internship Admissions, and College Admissions. Life at Chegg: https://www.youtube.com/watch?v=yyhnkwid7oo. Chegg benefits: http://www.chegg.com/jobs/benefits. Chegg for good: http://blog.chegg.com/category/chegg-for-good. Glassdoor best places to work 2015: http://bit.ly/gdbptw. Chegg UX design page: http://www.cheggux.com. For more information, visit www.chegg.com. Chegg is an equal opportunity employer.
Requirements:
- 5-7+ years of experience in technology consulting
- Lead teams of 2-10+ technical resources
- Conduct hands-on development, configuration, or system setup
- Present technology architecture and solution overviews to executive-level audiences
- Create presentation materials with an emphasis on simplifying complex ideas into digestible topics
- Provide leadership across multiple accounts as well as support sales and business development activities
- Apply technical aptitude and design patterns across multiple industry verticals and use cases
- Conduct assessments and define enterprise roadmaps for enabling a data-driven analytics organization
- Experience with the approaches and challenges of standing up an analytics organization at varying maturities
- Experience designing and implementing enterprise data warehouses (EDW) and related technologies (Teradata, Oracle, MS, etc.)
- Experience or technical knowledge of Hadoop and related technologies (Cloudera, Hortonworks, etc.)
- Experience with data integration and streaming tools used for both EDWs and Hadoop (Informatica, Spark, Kafka, etc.)
- Experience with cloud platforms (AWS, Azure), including readiness, provisioning, security, and governance
- Experience or understanding of data science and related technologies (Python, R, SAS, etc.)
- Experience or understanding of artificial intelligence (AI), machine learning (ML), and applied statistics
- Participates in hiring resources; participates in and drives early-stage business development and incremental work
- Effectively deals with ambiguity by confidently walking through an undefined situation and drawing out key requirements and political constraints
- Conducts facilitated working sessions with technical executives and experts
- Travels to client locations
- Bachelor's degree; US citizen or GC holder

Sense Corp powers insight-driven organizations. We accelerate the entire transformation life cycle, from strategy through implementation. We deliver outcomes by transforming data into insights. We unlock value in your most critical interactions with digital transformation: the Sense Corp compass. We may be the only management consulting firm in the country where being brilliant isn't enough to land you a job. Sense Corp people must be brilliant, creative, human, and fun, all at once. In other words, we hire terrific, well-rounded people. It's one reason clients love working with us, and it's why we enjoy working with each other. We may not sound like typical consultants, but that's OK; we don't think like them either.

About DSC: we're on a mission to build a better bathroom. We started with razors way back in 2011, and now, millions and millions of members later, we've expanded into shave products, skin care, and hair styling, with more to come. We're always growing and reinventing, and we rely on killer talent to help us achieve our goal of owning the bathroom. If you're a team-playing innovator, you'll fit right in; a sense of humor helps too.

Role summary: we are seeking a data architect who is a visionary in defining and managing data architecture, with expertise in data modeling, data marting, ETL, performance tuning, data governance, and data security, leveraging big data technologies and columnar and time series data stores along with traditional RDBMS. The right candidate will be a self-motivated, results-driven technologist who is passionate about collaboration but can work independently and lead by example.

Responsibilities:
- Define system-level architecture and conduct dimensional modeling and data marting
- Mentor team members through conceptual and logical modeling and drive physical modeling on the data marts
- Define data security protocols and enable access controls
- Conduct database performance tuning and architect low-latency data systems
- Extensive experience building a master data management strategy in an organization
- Build highly scalable data marts that can be used by DSC globally; responsible for maintaining data integrity across multiple data marts
- Build the overall data mart architecture and design, and document the data systems ecosystem
- Map data from sources to the data marts and work with peer data engineering teams to pipeline the data
- Design and code highly scalable solutions for data extraction from the data lake and transformation jobs for business rules application
- Define and parallel-process the ETL jobs towards low-latency, highly scalable systems (see the sketch after this posting)
- Architect, design in detail, and code data quality frameworks that can measure and maintain data completeness, data integrity, and data validity between interfacing systems
- Document data mapping and maintain a data dictionary across all DSC enterprise data
- Own the KPIs to measure the performance of the data marts and provide visibility to senior management
- Design for self-serve BI platforms and drive a higher adoption rate

Qualifications:
- Master's degree in computer science, data science, or related majors
- Minimum 10 years of industry experience overall
- 9+ years of data warehousing and data architecture, with 8+ years of data modeling and data processing for large-scale, near-real-time big data platforms like Redshift, HBase, Druid, or Snowflake
- 6+ years architecting end-to-end self-serve BI platforms using BI tools like Tableau, Qlik Sense, Looker, or the like
- 6+ years of ETL knowledge and parallel-processing technologies like Spark and Kafka streaming
- 10 years of programming experience with Java, Python, or C/C++ in a Linux/Unix environment
- Minimum 1 year of working knowledge of cloud-based solutions hosted on AWS
- Confluence, GitHub, and Jira are other tools and technologies preferred

DSC culture: we work in an open-air, freshly renovated office in the heart of Silicon Beach. As we disrupt industries and unseat corporate giants, our plan is to think big but stay small. No egos, no jerks, no prima donnas; just awesome folks who live and breathe collaboration and dig the perks, like haircuts, weekly food trucks, and team happy hours. And yes, we have snacks. Dollar Shave Club is an equal opportunity / affirmative action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation, national origin, disability, or protected veteran status. Reasonable accommodation: Dollar Shave Club provides reasonable accommodation so that qualified applicants with a disability may participate in the selection process. Please advise us of any accommodations you request in order to express interest in a position by e-mailing accommodations@dollarshaveclub.com. Please state your request for assistance in your message; only reasonable accommodation requests related to applying for a specific position within Dollar Shave Club will be reviewed at the e-mail address supplied. Thank you for considering a career with Dollar Shave Club. #li-cb1
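For the parallel-processed ETL jobs the DSC posting calls for, a minimal sketch of fanning transformation work out across processes; the extract/transform function and partition list are invented for illustration:

    from concurrent.futures import ProcessPoolExecutor

    PARTITIONS = ["2024-01", "2024-02", "2024-03"]  # e.g., one ETL job per month

    def run_etl(partition: str) -> str:
        # Extract and transform one partition; in practice this would read from
        # the data lake, apply business rules, and load a data mart table.
        rows = [{"partition": partition, "value": i * 2} for i in range(3)]
        return f"{partition}: loaded {len(rows)} rows"

    if __name__ == "__main__":
        # Independent partitions can load in parallel, cutting end-to-end latency.
        with ProcessPoolExecutor(max_workers=3) as pool:
            for result in pool.map(run_etl, PARTITIONS):
                print(result)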
GPS Staffing is a nationwide preferred provider of staffing and recruitment services. GPS has over 25 years of dedicated staffing partnerships with organizations ranging from international Fortune 500 companies to Colorado startups.

What we're looking for: this role will function as the business intelligence architect in a team of business intelligence and analytics development resources. This position is responsible for proactively developing architecture and strategies for optimal analytics, reporting, and information management solutions. We're looking for a BI architect who can design, deliver, and direct the creation of data movement solutions, data modeling, cleansing, conforming, tool usage, security, source code management, and analytic self-service. They will work closely with the BI product owner, business liaisons, and the business to obtain a solid understanding of the business needs and long-term objectives, and be able to help align the data needs to meet the business growth objectives.

What you'll do:
- Collaborate across the organization to determine a relevant data collection and aggregation strategy and a data warehouse and transactional data repository approach
- Lead data architecture definition and work with the product owner and technical resources on design approaches and tool usage
- Reinforce the need for, and consistency of, peer reviews around architecture designs, code, and unit tests, and participate in them
- Provide insights to the business intelligence data strategy to facilitate the use of information throughout the organization
- Architect, plan, and drive data integration work
- Develop a data dictionary and own a data governance program and process (see the sketch after this posting)
- Lead data architecture analysis, design, and implementation, and ensure that the delivered product fulfills the requirements
- Utilize agile development practices to deliver incremental business value
- Maintain knowledge of current research, trends, technology, and best practices related to data management, data warehousing, analytics, advanced analytics, and data science
- Participate in the creation and ongoing refinement of data warehouse and business intelligence standards and best practices
- Drive data audits, code reviews, QA, integration, and testing activities
- Prepare activity and status reports for active projects as required

Basic requirements:
- BS in computer science, computer engineering, or a related discipline, or equivalent work experience
- Researching and overcoming complex and challenging business requirements with creative data and analytic solutions and systems
- Building out a platform that can help monetize and produce new product opportunities for all of the company brands
- Influence the evolution of EDW, big data, master data management, and BI strategy and architecture
- Mastery of SQL and data modeling techniques and best practices, including the Kimball and Inmon approaches
- Experience and a high level of comfort with data analysis, data profiling, conforming data, and data integration solutions
- Experience with data governance as well as master data management approaches and solutions
- Solid experience with the following technologies and platforms: MS SQL, Postgres, Redshift, Tableau, Attunity, SSRS, SSIS, SSAS, PowerShell, etc.
- Strong SQL skills using large data sets (multi-terabyte); experience in tuning, problem analysis, and resolution
- AWS experience is a plus
- Successful experience training developers on BI tools and solutions
- Self-starter and results-driven individual
- Written and verbal communication skills; excellent presentation skills and the ability to work in a team environment
- 7+ years hands-on experience within the data warehousing and business intelligence disciplines
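Where the posting asks the architect to develop a data dictionary, one lightweight starting point is generating its skeleton from the database catalog itself; a minimal SQLite sketch, with an invented schema (the same idea applies via any engine's information schema):

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT, email TEXT)")
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")

    # Walk the catalog and emit table/column/type rows: the skeleton of a
    # data dictionary that stewards can then annotate with business definitions.
    tables = [r[0] for r in conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'")]
    for table in tables:
        for _, col, ctype, notnull, default, pk in conn.execute(
                f"PRAGMA table_info({table})"):
            print(f"{table}.{col}\t{ctype}\t{'PK' if pk else ''}")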
About us: Staff IT Enterprises leverages over 15 years of experience in excellence to empower our clients and candidates. We navigate the ever-evolving world that is information technology expertly and comprehensively by creating innovative talent acquisition solutions. We offer full-service solutions that enable our clients to catapult their growth and maximize their organization's potential.

Job summary:
- Perform architecture design, data modeling, and implementation of big data platforms and analytic applications for clients
- Analyze the latest big data analytic technologies and their innovative applications in both business intelligence analysis and new service offerings
- Architect and implement complex IoT data analytic solutions
- Stand up and expand data-as-a-service collaboration with partners in the US and other international markets
- Develop highly scalable and extensible big data platforms which enable collection, storage, modeling, and analysis of massive data sets, including those from IoT and streaming data (see the sketch after this posting)
- Drive architecture engagement models and be an ambassador for partnership with IT delivery and external vendors
- Effectively communicate complex technical concepts to non-technical business and executive leaders
- Lead large and varied technical and project teams
- Assist with scoping, pricing, architecting, and selling large project engagements

Qualifications / requirements:
- Over 10 years of engineering and/or software development experience and demonstrable architecture experience in a large organization; experience should contain 5+ years of architecture support across a combination of these environments: warehouse, datamart, business intelligence, and big data
- 5+ years of consulting experience desired
- Hands-on experience with big data components and frameworks such as Hadoop, Spark, Storm, HBase, HDFS, Pig, Hive, Scala, Kafka, Python scripts, and Unix shell scripts
- Experience in architecture and implementation of large and highly complex projects
- Deep understanding of cloud computing infrastructure and platforms
- History of working successfully with cross-functional engineering teams
- Experience in business domains like manufacturing, communications, finance, and supply chain
- Demonstrated ability to communicate highly technical concepts in business terms and articulate the business value of adopting big data technologies
- Cloud platform technologies such as Microsoft Azure, Amazon Web Services, and Google Cloud; on-premises big data platforms such as Cloudera and Hortonworks
- Big data analytic frameworks and query tools such as Spark, Storm, Hive, and Impala
- Streaming data tools and techniques such as Kafka, AWS Kinesis, and Microsoft streaming analytics
- ETL (extract-transform-load) tools such as Pentaho, Talend, or Informatica; also experience with ELT
- Continuous delivery and deployment using agile methodologies
- Data warehouse and datamart design and implementation
- NoSQL environments such as MongoDB and Cassandra
- Data modeling of relational and dimensional databases
- Metadata management, data lineage, and data governance, especially as related to big data
- Structured, unstructured, and semi-structured data techniques and processes
- Bachelor's degree in a technical field such as computer science, mathematics, physics, engineering, economics, or similar from a four-year college or university; master's degree or higher preferred

Travel: regular and expeditious travel throughout the United States, and occasionally overseas, is required to meet client needs and timetables; available to be stationed at and work from an out-of-town client site for an extended period of time. Staff IT Enterprises is an equal opportunity employer committed to providing a non-discriminatory environment for its employees.
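For the streaming ingestion side (Kafka appears in both the skills and tools lists above), a minimal consumer sketch using the kafka-python client; the topic name, broker address, and JSON event shape are invented, and a running broker is assumed:

    import json
    from kafka import KafkaConsumer

    # Subscribe to a hypothetical IoT telemetry topic and process events as
    # they arrive; in a real pipeline these would land in a lake or warehouse.
    consumer = KafkaConsumer(
        "iot.telemetry",
        bootstrap_servers="localhost:9092",
        value_deserializer=lambda b: json.loads(b.decode("utf-8")),
        auto_offset_reset="earliest",
    )

    for message in consumer:
        event = message.value
        if event.get("temperature_c", 0) > 80:
            print("alert:", event)  # e.g., route to an alerting sink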
Who we are: Farfetch is unlike anything in the world of fashion and technology. Our mission: to revolutionize the way the world shops. To do it we need innovators, people who challenge convention and dare to dream. We've gone from a start-up to a billion-dollar business, but we're not done yet; far from it. Be bold, be brilliant: together we can be extraordinary. We have rapidly grown into a truly global company since our launch in 2008, and we're continuing to grow. Our family now includes partner boutiques and brands across Europe, North and South America, and Asia; we demonstrate our "think global" value in everything we do. We are a global team of over 1,500 people and have offices based in London, New York, L.A., Porto, Guimaraes, Lisbon, Sao Paulo, Shanghai, Moscow, Hong Kong, and Tokyo. We are a company with an entrepreneurial spirit and innovative culture. We are positive, passionate, and live our values: be human, be brilliant, todos juntos, be revolutionary, think global, and amaze customers, day to day.

The team: Farfetch's data teams are focused on everything related to data. Their main purpose is to harness the power of Farfetch's data to deliver insights and reports that support business decisions, and also to analyze and discover new ways to amaze our customers. These teams cover multiple areas related to data, such as business intelligence, software and data engineering, data science, and data analytics. Just like the rest of Farfetch, the data teams are committed to helping the company become a leading e-commerce platform, and as such they are constantly looking for brilliant people who like the challenges that a fast-growing, data-driven company faces on its path to global market leadership.

The role: to be developed within our data governance team, this position will require good analytical skills with some technical skill exposure. Tasks include, but are not limited to, mapping, extraction, transformation, and validation of data from the various data sources; being responsible for governance and policies; standardizing data naming; establishing consistent data definitions; and monitoring and auditing the overall enterprise data quality.

What you'll do:
- Monitor and analyze data within the Farfetch suite of applications to ensure data quality and reliability
- Work with the BI product owner on data solution requirements, improvement, and change management
- Work with the security team to guarantee data privacy
- Act as the point of contact for business users for reporting data issues and providing root cause analysis
- Work with the business owners to document, implement, and maintain the metadata and business rules
- Analyze changes to databases and the data they contain to verify quality
- Develop and enforce methods and validation mechanisms for ensuring data quality and accuracy throughout the warehouse
- Develop and implement data metrics to monitor compliance and data quality
- Be responsible for maintaining the metadata in the metadata repository

Who you are:
- A professional with a bachelor's degree in computer science, statistics, or a related field
- An excellent problem-solver with communication and time management skills
- Detail oriented; able to work independently and set priorities; very organized
- A professional with excellent English language skills, both written and spoken
- A professional with excellent communication and presentation skills, both verbal and written, able to work across different departments, scoping out needs and translating them into output that makes sense
- Knowledgeable of Excel and SQL, and of key database concepts: data models, relationships between different types of data, etc.
- Experienced with master data management (MDM) systems (a plus)

We can't wait to receive your application, but before you send it to us, here are some helpful tips to make sure your application is as strong as it can be. Have you set out why this role is a good match for your career aspirations, and that you have the skills and experience required? We want you to be as clear about your future ambitions as we are, and whilst we encourage people to learn, develop, and grow, you will need to hit the ground running. Have you checked spelling and grammar?
We have high standards, and you don't want to miss out because of something as easily correctable as a typo. We are committed to equality of opportunity for all employees. Applications from individuals are encouraged regardless of age, disability, sex, gender reassignment, sexual orientation, pregnancy and maternity, race, religion or belief, and marriage and civil partnerships.

Got a taste for something new? We're Grubhub, the nation's leading online and mobile food ordering company. Since 2004 we've been connecting hungry diners to the local restaurants they love. We're moving eating forward with no signs of slowing down. With more than 80,000 restaurants and over 14 million diners across 1,600 U.S. cities and London, we're delivering like never before. Incredible tech is our bread and butter, but amazing people are our secret ingredient. Rigorously analytical and customer-obsessed, our employees develop the fresh ideas and brilliant programs that keep our brands going and growing. Long story short, keeping our people happy, challenged, and well-fed is priority one. Interested? Let's talk. We're eager to show you what we bring to the table.

Responsibilities:
- Develop compelling POCs for data solutions using emerging technologies for real-time and big data ingestion and processing
- Contribute to designing, building, and deploying high-performance production platforms and infrastructure to support data warehousing, real-time ETL, and batch big-data processing; help define standards and best practices for enterprise usage
- Design, build, and maintain processes and components of a streaming data ETL pipeline to support real-time analytics (from requirements to data transformation, data modeling, metric definition, reporting, etc.)
- Focus on data quality: trace data and analytics quality issues all the way down to root cause, and implement fixes and data audits to prevent and capture such issues
- Collaborate with data scientists to design and develop processes to further business-unit and company-wide data science initiatives on a common data platform
- Translate business analytic needs into enterprise data models and the ETL processes to populate them

Requirements:
- Bachelor's in technology, science, statistics, or math preferred, but equivalent experience is acceptable
- 8-10 years of data engineering experience in traditional data warehousing, ETL, and/or big data pipeline and processing environments
- Strong data modeling and SQL experience (dimensional star, transactional 3NF)
- Strong programming experience with any of Python, Scala, or Java
- Experience with emerging big data processing technologies (Spark, Storm, Kafka, Flume, Pig, Hive, Sqoop, Hadoop MapReduce, etc.)
- Experience with columnar storage and massively parallel processing data warehouses (Redshift preferred)
- Experience modeling and querying for NoSQL databases (Cassandra preferred, HBase acceptable)
- Experience working within the Amazon Web Services (AWS) ecosystem (S3, EC2, etc.)
- Experience working within agile software engineering methodologies
- Strong analysis and communication skills required

Got these? Even better:
- Experience with one or more ETL / data integration frameworks (e.g., Talend, Informatica, Pentaho, etc.)
- Exposure to BI / analytics platforms (e.g., Tableau, MicroStrategy, etc.)
- Familiarity with statistical methods and experimentation (A/B testing; see the sketch below)
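Since the preferred qualifications close with A/B testing, a minimal two-proportion z-test sketch; the conversion counts are invented for illustration:

    import math

    # Control vs. variant: conversions out of visitors (illustrative numbers).
    conv_a, n_a = 120, 2400   # 5.0% conversion
    conv_b, n_b = 156, 2400   # 6.5% conversion

    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se

    # Two-sided p-value from the normal approximation.
    p_value = math.erfc(abs(z) / math.sqrt(2))
    print(f"lift={p_b - p_a:.3%} z={z:.2f} p={p_value:.4f}")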
Benefits:
- Unlimited paid vacation days; choose how your time is spent
- Never go hungry! We provide weekly Grubhub/Seamless credit
- Regular in-office social events, including happy hours, wine tastings, karaoke, bingo with prizes, and more
- Company-wide initiatives encouraging innovation, continuous learning, and cross-department connections

We deliver favorites every day. Join us as we move eating forward. Grubhub is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, veteran status, and other legally protected characteristics. The EEO is the Law poster is available here: DOL poster. Grubhub is committed to working with and providing reasonable accommodations to individuals with disabilities. If you need a reasonable accommodation because of a disability for any part of the employment process, please send an e-mail to talentacquisition@grubhub.com and let us know the nature of your request and your contact information.

The Minneapolis Medical Research Foundation's (MMRF) Chronic Disease Research Group (CDRG, www.cdrg.org) has an opportunity for an experienced SQL database developer to support their work in their Scientific Registry of Transplant Recipients department (SRTR, www.srtr.org).

Position summary: the SQL database developer participates in the design, creation, maintenance, backup, recovery, replication, and performance of on-site databases; maintains database software installations, upgrades, and patching; and completes complex database-related projects and initiatives. Additionally, the SQL database developer enforces security policies and procedures and maintains user access to databases, works directly with users to resolve database access and performance issues, and serves as the primary liaison to any organizations providing data to be incorporated into the databases. The SQL database developer will participate in developing SSIS data transformations, PowerShell automation, data warehousing, and innovative visualization dashboards, and in integrating SQL data with R code and statistical models.

Essential job functions:
- Support routine database operations for the program
- Assist in development of improved ETL and data mart processing
- Assist in visualization and reporting solutions
- Ensure compliance with federal information systems security requirements, implementing and maintaining security requirements as necessary
- Adhere to processes and procedures to support end users of the data
- Participate in encouraging a supportive, friendly, collaborative team culture
- Work with hardware and software engineers as necessary to implement required features
- Remain abreast of data science trends and best practices
- Maintain a balanced work/personal life interaction

Employment standards (education/experience): any equivalent combination of education and experience that provides the required knowledge and skills is qualifying. Typical qualifications would be a Bachelor of Science (B.S.) or Bachelor of Engineering (B.E.) in computer science, information technology, or a related field and 5-8 years of applicable experience. Proficient in 3GL/SQL databases such as Oracle, Microsoft SQL Server, or MySQL. Familiar with data modeling and database-related tools, and with SQL stored procedure coding techniques. Understands the concepts of business intelligence and data mining, and the power of quality data visualization and dashboards. Experience with backups, restores, and recovery, including copying databases from production to test servers (see the backup sketch at the end of this posting). Experience in executing operational automation using scripts and defined work instructions.
Familiarity with executing and maintaining database security requirements; familiar with basic statistical concepts.

Preferred experience:
- Some experience with programming languages: .NET, C#, Java, JavaScript, PowerShell, SAS, R, HTML
- Experience in troubleshooting and resolving database performance and design problems
- Experience with the Microsoft data stack (SSMS, SSAS, SSIS, SSRS) and visualization tools
- Some understanding of Service Broker in SQL Server
- Familiarity with federal information system security requirements

Skill, knowledge, and ability (SKA): has a strong interest in building state-of-the-art business intelligence solutions using best practices in data science. Requires interaction with a diverse population. Knowledge and skills with T-SQL, SQL Server relational databases, tabular cubes, database security, disaster recovery, database administration, scripting languages, reporting tools, and source control. Can migrate between test and production virtualized environments. Excellent documentation and communication skills in both verbal and written formats. Excellent organizational and time management skills. Demonstrated value for a positive, collaborative, friendly work environment. Ability to present ideas and information in user-friendly language. Self-motivated and willing to learn; strong attention to detail. Ability to obtain and hold a public trust clearance; must have resided in the United States for three of the last five years as part of the HRSA clearance requirements for this role. AA/EOE of minorities, women, disabilities, veterans. Job posted by ApplicantPro.
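For the backup-and-copy duties mentioned above, a minimal sketch using Python's built-in sqlite3 backup API purely as a stand-in; a SQL Server shop would use native tooling instead, and the file names are invented:

    import sqlite3

    # Online backup of a "production" database into a fresh "test" copy.
    src = sqlite3.connect("production.db")
    dst = sqlite3.connect("test_copy.db")
    with dst:
        src.backup(dst)  # copies the whole database, safe while src is in use
    src.close()
    dst.close()
    print("backup complete")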
Founded in 1998, Matrix Technology Group is an ERP and IT consulting services provider. Matrix Technology Group provides services in ERP, BI, and application development. Our staff's passion and dedication set us apart from other IT firms. Our team is dynamic and focused on our client needs. Our team is geared to work with consultants and our clients to achieve higher performance. We want to work with you, and we welcome candidates who are talented, passionate, dedicated, and have the ambition to grow. One of our highly esteemed clients has an immediate need for a strong big data architect. (US citizens and those authorized to work in the US are encouraged to apply; we are unable to sponsor H1B candidates at this time.)

Role: big data architect. Location: Austin, TX. Duration: long term.

Requisition details:
- Big data architect with at least 10 years of overall experience and at least 5 years of Hadoop architecting and development experience
- Should have experience in setting up and configuring a medium-to-large-size Hadoop ecosystem
- Should have experience with Apache NiFi
- Strong knowledge of MapR mandatory
- Knowledge of other big data allied technologies and distributions: Hive, Pig, Spark, Kafka, etc.
- Familiarity with Shiny, R, and Tableau will be an added advantage
- The candidate will need to work very closely with the client's data science team and should be able to design a best-in-class architecture for ingesting large amounts of manufacturing data into a Hadoop cluster
- Should have good communication skills
- Knowledge of data science tools and technologies will be an added advantage

In case your skills match the above-mentioned requirement, kindly forward your resume in Word format along with the rate/salary expected and a contact number. (Feel free to reach me at 908-279-1236 or e-mail me: ssolanki@matrixonweb.com.) Provided by Dice: big data architect, Hadoop architecting, Apache NiFi, MapR, Hive, Pig, Spark, Kafka.

Job duties:
- Expertise in report development and the relevant object creation; expert in MicroStrategy administration, object deployment, and other MicroStrategy tools
- Experience in building dashboards, documents, and dossiers for mobile and web
- Working knowledge in either Postgres or SQL Server: procedures, functions, views, and DB objects
- Knowledge of DB, SQL, ETL; Oracle data migration experience
- Data warehousing experience
- Data transformation, data wrangling, data science

What we do: at Goldman Sachs, our engineers don't just make things; we make things possible. Change the world by connecting people and capital with ideas. Solve the most challenging and pressing engineering problems for our clients. Join our engineering teams that build massively scalable software and systems, architect low-latency infrastructure solutions, proactively guard against cyber threats, and leverage machine learning alongside financial engineering to continuously turn data into action. Create new businesses, transform finance, and explore a world of opportunity at the speed of markets. Engineering, which is comprised of our technology division and global strategists groups, is at the critical center of our business, and our dynamic environment requires innovative strategic thinking and immediate, real solutions. Want to push the limit of digital possibilities? Start here.

Who we look for: Goldman Sachs engineers are innovators and problem-solvers, building solutions in risk management, big data, mobile, and more. We look for creative collaborators who evolve, adapt to change, and thrive in a fast-paced global environment.

Our impact: enabling Goldman Sachs' research analysts to model investment ideas efficiently and accurately whilst ensuring that clients can access these ideas when, where, and how they choose. GIR technologists leverage the latest in cloud, mobile, and big data technologies to help define and deliver our digital investment research strategy.

Your impact: the Goldman Sachs investment research division is undergoing a "digital-first" transformation that will lead to a landmark shift in how our research is produced and consumed by our clients. You'll join the team that is a key part of this transformation into a data-driven organization as we build out the underlying data platform for critical business metrics. Many greenfield opportunities await the successful candidate as we build this platform for gaining insights into business and monetizing our content.
digital research consumption and translate those insights into actionable ideas.
- There are opportunities for the successful candidate to work on a diverse set of technology platforms and to be involved in day-to-day coding, building the research data platform from scratch and integrating with other firm-wide data platforms and technologies.
- We sit with our business users and are exposed daily to their thought processes, allowing us to apply relevant technology solutions to the problems they face.

Basic qualifications:
- Experience in designing data models and pipelines for mission-critical analytics with senior stakeholders
- Proficient in creating data models optimized for performant extraction (SQL, NoSQL, and/or Hadoop/Hive/Spark)
- Familiarity with BI tools (QlikView, Tableau, Zoomdata) and visualization APIs (D3.js)
- Proven record of producing robust, well-tested solutions and familiarity with the full product lifecycle (requirements gathering, implementation, through support)
- A strong analytical and problem-solving mindset
- Willingness to collaborate with our business partners to understand and implement their requirements

Preferred qualifications:
- Experience working on data architecture and modeling for enterprise data hub implementations
- Experience in implementing data transformations in Spark SQL and/or Python/R
- An understanding of current data engineering frameworks to support data science and machine learning efforts
- Agile methodologies and experience working in an agile environment
- Experience working in a global team

The Goldman Sachs Group, Inc. is a leading global investment banking, securities, and investment management firm that provides a wide range of financial services to a substantial and diversified client base that includes corporations, financial institutions, governments, and individuals. Founded in 1869, the firm is headquartered in New York and maintains offices in all major financial centers around the world. © The Goldman Sachs Group, Inc., 2018. All rights reserved. Goldman Sachs is an equal employment / affirmative action employer: female / minority / disability / vet.
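As a hedged illustration of the "data transformations in Spark SQL" item above (the table, columns, and output path are my own placeholders, not Goldman Sachs code, and the source table is assumed to already be registered in the metastore):

```python
# Hypothetical Spark SQL transformation: derive per-client daily research
# consumption metrics from a raw read-event table.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("research-metrics").getOrCreate()

daily_consumption = spark.sql("""
    SELECT client_id,
           CAST(event_ts AS DATE)  AS event_date,
           COUNT(*)                AS reads,
           COUNT(DISTINCT doc_id)  AS distinct_reports
    FROM research_read_events      -- assumed pre-registered table
    GROUP BY client_id, CAST(event_ts AS DATE)
""")

daily_consumption.write.mode("overwrite").parquet("/metrics/daily_consumption")
```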
Company description: GoHealth has an ambitious mission: to improve the health care system in America. Achieving this mission relies on hiring and developing great people, which is why our team is our top priority. We encourage employees to do their best work through innovation and risk taking. Our environment is fun yet constructive, thanks to leaders whose doors are always open, and, most importantly, we'll never stop investing in you and your career.

Job description: GoHealth is looking for senior data engineers who will be responsible for the design, development, execution, and delivery of all data needs for the company. Senior data engineers are technical owners of one or more of GoHealth's data pipelines and data marts. They also support the development of the data infrastructure necessary for full-scale data science, predictive analytics, and machine learning.

Responsibilities: live the GoHealth culture and ensure it is represented within the team. Collaborate with the other senior data engineers and the rest of the data engineering team, subject matter experts, and department leaders in understanding, analyzing, designing, implementing, and deploying new data-related processes, data marts, reports, and data pipelines. Lead multi-person projects while managing data operations. Establish and improve GoHealth's data pipeline and data mart operations. Ensure data quality and compliance with development, architecture, reporting, and regulatory standards throughout the entire data pipeline. Work with the rest of the data engineering team to cross-train and provide support for other BI tasks such as cube maintenance, data analytics, and requirements gathering. Mentor, guide, and develop the data engineers within the team.

Qualifications: bachelor's degree in computer science or equivalent experience required. 8+ years of experience in the design and development of data pipelines, data marts, and data warehouses. Excellent analytical and problem-solving ability with strong attention to detail and accuracy. Excellent understanding of data warehousing concepts and dimensional data modeling. Hands-on experience with troubleshooting performance issues and fine-tuning queries. Strong ability to handle multiple tasks and adapt to evolving business and technical environments. Self-starter with the ability to work independently, take initiative, and learn new skills. Excellent written and oral communication skills, with the ability to articulate complex processes to individuals of varying technical abilities. Experience in software engineering practices, and firm knowledge of software engineering tools, including version control systems (Git, Bitbucket, SVN, or Team Foundation). Proficient in at least one programming language (Python, Java, Go, C#, Ruby, C, C++) and expected to write code. Experience in Microsoft SQL Server, SSIS, SSRS, Power BI, or Azure is preferred. Familiarity with other data warehouse platforms like AWS Redshift or AWS Data Pipeline.

Additional information: GoHealth offers a full benefits package, an open vacation policy, a very casual dress code, and a fun, interactive working environment. Remote work option.

About the company: our client is a leader in providing specialized health management services to health plans, employers, and unions through a network of licensed and company-managed health care providers. As part of their industry expertise, they use advanced data analytics to identify health plan individuals who may be avoiding specialized and individualized healthcare treatment. Their integrated substance dependence program is designed to address substance dependence as a chronic disease; the program seeks to lower costs and improve member health through the delivery of integrated medical and psychosocial interventions, in combination with long-term care coaching and online tools, to aid members in their recovery.

Why work here: ability to work remote; great pay and excellent benefits (healthcare, 401k, holidays, vacation, sick days, etc.); company culture encourages growth and promotions; employees routinely recognized for their efforts and supported with educational opportunities; supportive and knowledgeable co-workers; positive work environment; a company that provides help to people who otherwise might not receive proper treatment; a company that is rapidly growing, innovative, and dedicated to creating a world-class culture; close-knit team and work atmosphere; cutting-edge use of data analytics for innovative care models that achieve program success; leadership shares vision, achievements, and goals, and welcomes your ideas and feedback; meaningful work environment where your efforts are helping improve other people's lives; paid travel expenses and paid training.

About your role: primarily responsible for data stewardship strategy and roadmap creation, including data design, integrity, capacity planning, storage management, movement, security, etc., across the enterprise. Design and management of the production databases. Responsible for the data architecture within the organization, including the Hadoop data pipe, MDM, ETL operations, and operational and business
analytical reporting and operational efficiencies. Collaboration with the application development team and the data science team in support of production and research/development objectives. Propose database solutions through potential systems designs.

Responsibilities: ownership of the configuration and maintenance of all data stores in production, staging, and QA environments. Providing insight into the performance and structure of the Hadoop data pipeline. Proactively monitor events, investigate issues, analyze solutions, and drive problems through to resolution, using a wide variety of ops tools and monitoring platforms to gain knowledge and understanding and to enable persistent monitoring of database availability, performance, and capacity. Establish and maintain accounts and access controls to all production databases. Data grooming and implementing retention policies. Maintaining the database monitoring systems and developing new metrics and monitoring dashboards as additional coverage events become necessary. Providing insight into the storage and utilization processes within the enterprise. Providing development support for data-model and query-pattern best practices.

Qualifications: excellent understanding of Hadoop clusters in a production environment; experience with Hive is preferred; knowledge of and experience with Cloudera tools; experience with disaster recovery best practices relative to data replication and maintaining high-availability uptime; ability to work collaboratively in a fast-paced, entrepreneurial environment; experience working with agile methodologies; excited by big data technologies and interested in integrating statistics and analytics to create peak system performance; knowledge of cloud technologies.

Summary: All Lines Technology is seeking a big data architect. In this role, the big data architect will lead the architecture and design of the data platform in alignment with providing a unified data solution: aggregating customer and healthcare operational data and providing easy access to powerful insights. As such, this position will be responsible for the technical and security architecture of the software applications as well as the supporting infrastructure, and will help drive quality, reliable, secure applications using industry-standard best practices.

Primary duties and responsibilities: lead a development team of big data designers, developers, data scientists, and DevOps. Implement a big data enterprise warehouse, BI, and analytics system using Hive, Spark, Redshift, EMR (Hadoop), and S3. Develop and maintain processes to acquire, analyze, store, cleanse, and transform large datasets using tools like Spark, Kafka, Sqoop, Hive, NiFi, and MiNiFi. Provide recommendations, technical direction, and leadership for the selection and incorporation of new technologies into the Hadoop ecosystem. Participate in regular status meetings to track progress, resolve issues, mitigate risks, and escalate concerns in a timely manner. Contribute to the development, review, and maintenance of product requirements documents, technical design documents, and functional specifications. Help design innovative customer-centric solutions based on deep knowledge of large-scale data-driven technology and the healthcare industry. Help develop and maintain enterprise data standards, best practices, security policies, and governance processes for the Hadoop ecosystem.

Education: bachelor's in computer information technology, computer science, management systems, or a related discipline required; master's preferred.

Experience: four-year degree in computer science, software engineering, or related degree
program, or equivalent application development, implementation, and operations experience. Advanced study or degrees, such as a master's in business (MBA) or a master's/PhD in computer science, software engineering, or a related scientific degree program, are preferred. Minimum 5+ years of experience in large systems analysis and development, addressing unique issues of architecture and data management; has the experience to work at the highest technical level of all phases of systems analysis and development activity, across the full scope of the system development cycle. 4+ years of related experience on data warehousing and business intelligence projects. 3+ years of implementation or development experience with the Hadoop ecosystem. Working knowledge of the entire big data development stack. Experience handling very large data sets (tens of terabytes and up preferred). Experience with secure RESTful web services. Highly proficient with Java/Scala application development. Expert in Apache Spark infrastructure and development. Experience with Sqoop, Spark, Hive, Kafka, and Hadoop. Experience with automated testing for big data platforms. Experience with best practices for data integration, transformation, governance, and data quality. Experience with developing, designing, and coding, completing programming and documentation, and performing testing for complex ETL applications (Spark and Scala preferred). Experience with agile software development processes and development best practices. Experience with big data text mining and big data analytics preferred. Understanding of big data architecture, along with the tools being used in the Hadoop ecosystem. Ability to lead tool-suite selection and lead proofs of concept. Ability to share ideas within a collaborative team and drive the team based on technical expertise, sharing best practices from past experience.

Skills: strong understanding of, and experience executing, several software development methodologies and life cycles. Ability to understand and translate business requirements into technical specifications. Ability to negotiate with and influence senior management. Ability to lead and influence across departments and across levels of leadership, both internally and with customers. Proven ability to organize and manage multiple priorities, coupled with the flexibility to quickly adapt to ever-changing business needs. Excellent written and oral communication skills; adept at presenting complex topics, influencing, and executing with timely, actionable follow-through. Strong analytical and problem-solving skills, with the ability to convert information into practical training deliverables; uses rigorous logic and methods to solve difficult problems.

Source One is seeking a data management professional to support our procurement consulting firm based in Willow Grove, PA. A qualified candidate should demonstrate understanding and expertise in classifying and managing large data sets. This role requires a unique blend of research, data entry, data manipulation, organizational skills, and logical problem solving. Our clients engage us for guidance, expertise, and solutions to help transform the procurement and supply chain groups within their organizations. One of our firm's growth areas has been data science, including spend analysis, e-procurement, and procurement department "health" metrics. Spend analysis is the practice of reviewing a client's expenditures, suppliers, general ledger, and other sources of information in order to classify their spending habits into category structures (taxonomy). E-procurement is the facilitation of bid collection using
an online platform; this includes collection and analysis of data from Source One's proprietary e-procurement site, WhyAbe. Procurement department "health" metrics are the collection and analysis of internal and client data points intended to track the performance of a procurement organization. We're looking for an individual who understands the importance of a master data set, understands the challenges of redundant, inaccurate, or misclassified data, and is able to take ownership of large datasets spanning numerous clients and procurement tools. The ideal candidate will have the ability to research what data means and draw conclusions on how it should be classified, as well as the ability to organize structured datasets back into a master data repository (a rule-based classification sketch follows this posting).

Candidate qualifications and key responsibilities: cleanse and categorize data sets into our proprietary procurement taxonomy. Extract, transform, and load (ETL) master data. Manage and process data files (in Excel, Access, etc.) for interfacing departments with respect to master data. Manage the proper and efficient importation of new data into internal and external datasets. Review, research, and understand supply markets and supplier capabilities, technically and geographically. Build and maintain business rules for future automated classification. Manage multiple large datasets and periodically perform mass data updates as necessary. Coordinate with IT/analytics teams to identify and implement process-improvement solutions. Initiate corrective action to address issues or discrepancies in data collection and/or data processing practices. Understand database management best practices and develop data quality standards and data control sets for data in motion and data at rest. Perform data quality assessments and root cause analysis; develop charts, graphs, and tables.

Required qualifications: bachelor's degree or higher; US citizen or lawful permanent resident (green card holder); demonstrated understanding of tools used for data transformation, including but not limited to Excel; technical and functional knowledge of data transformation projects, including planning and execution; familiarity with procurement processes and strategic sourcing principles.

Preferred qualifications: bachelor's degree or higher; 2+ years' experience in professional services, consulting, procurement, strategic sourcing, or database management, specifically spend analysis and e-procurement platforms; SQL experience.

Other opportunities: additionally, an opportunity for growth into our consulting practice exists for candidates who wish to pursue that path. www jobs sourceoneinc com
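To make the classification idea concrete, here is a loose, hypothetical sketch of a first-pass, rule-based spend classifier in pandas; the taxonomy, keywords, and column names are invented and are not Source One's actual rules:

```python
# Hypothetical rule-based spend classification: map free-text general-ledger
# descriptions to a two-level taxonomy with keyword rules.
import pandas as pd

RULES = {  # keyword -> (category, subcategory); illustrative only
    "freight": ("Logistics", "Freight"),
    "hotel": ("Travel", "Lodging"),
    "laptop": ("IT", "Hardware"),
    "license": ("IT", "Software"),
}

def classify(description: str) -> tuple:
    text = description.lower()
    for keyword, category in RULES.items():
        if keyword in text:
            return category
    return ("Unclassified", "Review")  # route unmatched lines to manual review

spend = pd.DataFrame({"description": [
    "Inbound freight Q3", "Hotel - supplier visit", "Laptop refresh", "Misc.",
]})
spend[["category", "subcategory"]] = spend["description"].apply(
    lambda d: pd.Series(classify(d))
)
print(spend)
```

In practice the keyword table would grow into the maintained "business rules for future automated classification" the posting mentions, with the review bucket feeding back new rules.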
CDS Global, the leader in delivering marketing data and e-commerce solutions to the publishing industry, is looking for an outstanding data modeler to help us grow and enhance the analytics of the media industry, using a 174-million-customer database and some of the most powerful marketing automation tools available. We are at the heart of an industry transformation, partnering, guiding, and building solutions with some of the biggest media brands in the world as they move from traditional print and digital services to omni-channel delivery of content, services, and products on a 1:1 level. Reporting to the director of BI/BA and data science, the data modeler will be a hands-on leader who leverages our clients' datasets to provide descriptive and predictive analytics, including response models, churn analysis, forecasting using the ARIMA method, lifetime customer value, and more (see the forecasting sketch after this posting). With a demonstrated ability to balance business, technical, and people skills, the role will be best suited to an individual with the ability to multi-task across complex projects and diverse solution teams.

Detailed responsibilities include: maintains confidentiality of CDS Global's and its clients' proprietary information. Originates and executes statistical research projects of very complex scope on behalf of CDS Global or CDS Global clients. Conducts complex statistical modeling and data mining techniques: exploratory data analytics, GLM, clustering, decision trees, etc. Researches new ways of using statistical methodologies. Develops new approaches to solving business problems using advanced statistical techniques; brings new ideas on statistical techniques and their application to the business. Investigates and assists in data analysis; interprets data and identifies correlations using both multivariate analyses and machine learning. Offers recommendations to improve analyses; assists in documenting and summarizing results. Leverages and develops effective, clear data visualization methods to represent complex data. Performs extensive diagnostic checks of the assumptions supporting statistical inferences. Writes reports summarizing findings for a broad-based audience. Provides statistical assistance, support, and guidance to managers, business analysts, and other staff. Works with and leads technical and non-technical team members. Leverages complex data and statistical analysis in order to discover and provide clients with actionable business insights and opportunities. Interacts and consults with clients in order to drive product strategy. Leverages best practices in the use of traditional and digital data. Works with a variety of statistical analysis and data visualization tools; using various resources, learns and applies knowledge of current statistical software, databases, and data visualization techniques. Performs management and supervisory functions for employee(s) who perform statistical analysis responsibilities; provides training and guidance to employee(s); oversees and manages third-party statistical contractors. Interacts in a cooperative and professional manner with all levels of employees, vendors, and/or clients in a team environment.

Qualifications: master's degree in statistics, applied mathematics, or a related program of study; equivalent education and/or experience may be substituted for the minimum education requirement. Five or more years of successful, progressively complex working experience to obtain comprehensive knowledge of statistical theory. Ability to independently develop new methodology to meet unique requirements, such as statistical models in new areas of application and derivation of inference techniques. Usage of statistical methods with appropriate tests of assumptions, numerical accuracy, and correct interpretation of results. Experience with MicroStrategy or Tableau for reporting and visualization development preferred. Advanced knowledge to read complex statistical and mathematical literature and understand appropri…
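Since the posting names ARIMA forecasting specifically, a minimal sketch of the idea with statsmodels follows; the series, model order, and horizon are placeholder assumptions, not CDS Global's methodology:

```python
# Hedged sketch: forecasting monthly churn counts with an ARIMA model.
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

# Hypothetical monthly cancellation counts for one title.
history = pd.Series(
    [120, 132, 129, 143, 151, 148, 160, 158, 171, 169, 182, 190],
    index=pd.date_range("2017-01-01", periods=12, freq="MS"),
)

# Order (p, d, q) = (1, 1, 1) is an illustrative starting point; in practice
# it would be chosen by diagnostics (ACF/PACF plots, information criteria).
model = ARIMA(history, order=(1, 1, 1)).fit()
forecast = model.forecast(steps=3)  # next three months
print(forecast)
```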
Big data architect / lead solutions architect (Hadoop), Pittsburgh, PA. Our client in Pittsburgh, PA is looking to add a full-time, direct-hire big data architect to their growing software development team.

Primary duties: lead a development team of big data designers, developers, data scientists, and DevOps. Implement a big data enterprise warehouse, BI, and analytics system using Hive, Spark, Redshift, EMR (Hadoop), and S3. Develop and maintain processes to acquire, analyze, store, cleanse, and transform large datasets using tools like Spark, Kafka, Sqoop, Hive, NiFi, and MiNiFi. Provide recommendations, technical direction, and leadership for the selection and incorporation of new technologies into the Hadoop ecosystem. Participate in regular status meetings to track progress, resolve issues, mitigate risks, and escalate concerns in a timely manner. Contribute to the development, review, and maintenance of product requirements documents, technical design documents, and functional specifications. Help design innovative customer-centric solutions based on deep knowledge of large-scale data-driven technology and the healthcare industry. Help develop and maintain enterprise data standards, best practices, security policies, and governance processes for the Hadoop ecosystem.

Requirements and skills: bachelor's degree in computer information technology, computer science, or a related discipline required; a master's in business (MBA) or a master's/PhD in computer science, software engineering, or a related area preferred. 5+ years of experience in large systems analysis and development. 4+ years of related experience on data warehousing and business intelligence projects. 3+ years of implementation or development experience with the Hadoop ecosystem. Additional technical knowledge in the following areas: experience handling very large data sets; RESTful web services; Java/Scala application development; Apache Spark; Sqoop, Spark, Hive, Kafka, and Hadoop; automated testing for big data platforms; agile software development processes and best practices. Experience with big data text mining and big data analytics preferred. Experience executing several software development methodologies and life cycles. Ability to understand and translate business requirements into technical specifications. Proven ability to organize and manage multiple priorities. Excellent written and oral communication skills; strong analytical and problem-solving skills. (Big data architect / lead solutions architect, Hadoop - 15838.)

We have an urgent requirement for an AI architect / big data architect for one of our clients in Hartford, CT, for a contract role. Required skills (AI/AA, big data, cloud solution architect): model-as-a-service, with scaling and self-training of models in a microservices architecture; real-time productionalizing of models; applying machine learning techniques to a variety of datasets to build and enhance predictive models; data engineering, operationalizing models, and analytics. Experience building predictive models with programming languages like Python, R, SAS, or SQL. Experience building dashboards and reports using visualization software (Tableau and QlikView). Experience bringing prototypes to production on Hadoop-based and/or other NoSQL platforms. Ability to build scalable systems to analyze huge data sets and make actionable recommendations. Architecture and applications on, or incorporating, innovative data analytic technologies such as Hadoop, Spark, R, and other open source and proprietary data analytics platforms and tools. Experience with open source data science toolkits and with big data and distributed computing technologies such as Apache Hadoop, Apache Spark, and Hive. Ability to elevate and switch to work on AI / big data innovation (cloud, AWS) as needed; be proactive in driving architectural recommendations with clients. - provided by Dice

Big data developer - Bloomfield, CT. Job details:
+ Location: Charlottesville, VA
+ Salary: $65 - $75 per hour
+ Date posted: Wednesday, March 21, 2018
+ Job type: Contract
+ Industry: Information technology
+ Reference: 609711

Job description. Duties: the candidate will play a technical leadership role in the information life-cycle management (ILM) area, responsible for driving the adoption of software engineering best practices and innovative product introduction as well as implementation. The individual must have superb analytical and technical skills, coupled with the ability to deliver effectively. The individual will be expected to participate in the collaborative concept definition, architectural refinement, design, and realization of products that support the strategic needs within ILM and the information management organizations. The applicant will be working across multiple scrum teams that demand engineering and technical excellence, and whose members are expected to hold each other accountable for the overall success of the output. The focus of this opportunity is to deliver innovative solutions to complex problems, but also with a mind to drive simplicity in the further refinement and support of the solution by others. The focus of this individual will be to create, in a collaborative fashion, efficient and effective strategies and solution architectures that solve problems in the real world. We are building legal, regulatory, and cyber solutions using big data tools to improve analytics while leveraging artificial intelligence (AI) and machine learning (ML). We're looking for a big data software engineer to join our team. Through a hands-on engineering approach, you will engineer patterns across 8-10 scrum teams to tackle big data and data science problems. You will understand business requirements and work with cross-org teams. You will influence all aspects of the system, from ingestion of rich data sources using big data methodologies, to solving business problems that improve the company's cyber security position and information life-cycle management.

Skills:
+ Ask smart questions, take risks, and champion new ideas.
+ Business-oriented and able to communicate at all levels.
+ Provide technical expertise through a hands-on approach to teams and projects developing big data solutions.
+ Participate hands-on in software engineering, following our software development best practices.
+ Ensure adherence to existing big data direction and architectural strategies.
+ Conduct software engineering code reviews and deliver innovative software tools, technologies, and application frameworks; maintain deep levels of involvement in the implementation process.
+ Function as a hands-on member of the ILM teams, guiding and mentoring the development team to design, document, develop, deploy, and maintain applications.
+ At least 8+ years of experience in software development and best practices.
+ At least 4+ years of big data experience with different languages: strong programming skills (C++, Java, Python, Scala) along with the ability to pick up new languages; past demonstrable programming work; exposure to big data methods (Hadoop, Hive, Spark, MapReduce).
+ Understanding of cloud-based and distributed systems.
+ Strong expertise with Git.
+ Ability to perform analysis of business problems and technical environments.
+ Proven track record of success in challenging the status quo, implementing new ideas and designs with a practical orientation.
+ Ability to think strategically and implement iteratively; ability to estimate the financial impact of design/architecture alternatives.
+ Strong teamwork and collaboration skills.
+ Solid oral and written communication skills.
+ Experience deploying and managing large-scale
cloud-based solutions.
+ Track record of, and a passion for, being a team player; you take the lead when needed and also coach and develop others when needed.
+ Knowledge of graph- or stream-based analytics, or of deploying machine-learning analysis.
+ Experience creating benchmark tests, designing for scalability and performance, and designing and integrating large-scale systems.
+ Experience designing and building n-tier architecture-based applications.
+ You are confident in a DevOps environment, engineering products to be both reliable and adaptable.
+ You are frustrated if you are not being stretched or learning something new.
+ Be passionate about resolving user pain points through great design.
+ Be open to receiving feedback and constructive criticism.
+ Be passionate about all things big data, design, and innovation; research and showcase knowledge of the industry's latest trends and technologies.

Education: bachelor's degree in computer science or a related discipline; at least eight (typically ten or more) years of solid, diverse work experience in IT, with a minimum of eight years of software engineering work experience. Languages: English (read, write, speak).

Skills and experience. Required: Hadoop, Python, DevOps. Additional: distributed systems, Git, Hive, Java, large-scale systems, life cycle, machine learning, MapReduce, n-tier architecture, Scala, software development, software engineering, Apache Hadoop MapReduce, B2B software, mentoring, structured software, team player, technical leadership.

Date: Mar 20, 2018. Location: New Orleans, LA, US. Company: Entergy.

About Entergy: Entergy Corporation is an integrated energy company engaged primarily in electric power production and retail distribution operations. Entergy owns and operates power plants with approximately 30,000 megawatts of electric generating capacity, including nearly 9,000 megawatts of nuclear power. Entergy delivers electricity to 2.9 million utility customers in Arkansas, Louisiana, Mississippi, and Texas. Entergy has annual revenues of approximately $11 billion and more than 13,000 employees. Learn more about our corporate, utility, nuclear, power generation, and gas & transmission businesses here: http: www entergynewsroom com about-us

Primary location: Louisiana - New Orleans. Job function: information technology. FLSA status: professional. Relocation option: approved in accordance with the Entergy guidelines. Union description/code: non-bargaining unit (NBU). Number of openings: 1. Req ID: 77943. Travel percentage: up to 25%.

Job summary/purpose: this IT position is a key role on the enterprise analytics team supporting the enterprise data lake ecosystem. The data modeler - enterprise analytics plays an integral role in building a holistic view and roadmap of the company's technology strategy, processes, and information technology roadmap. The data modeler partners with both business and technology groups to ensure that proposed technical solutions align with the company's overall mission, vision, strategy, goals, and objectives. This role will be responsible for the data modeling and metadata management portion of the enterprise data lake ecosystem; as such, it will be expected to develop and maintain Entergy's enterprise data lake data and metadata catalogues and mappings, plus design access, usage, and stewardship. Also included is the development or use of process models and interface designs, and the development of internal and external checks and
controls to ensure proper governance, security, and quality of data assets. This position requires strong technical and communication skills, as well as proven experience in data management, information management, big data strategy and planning, information modeling and delivery, agile implementation, business collaboration, program and project management, and the utility industry in general. This role reports directly to the IT service pod manager - enterprise analytics, with a matrixed relationship to the enterprise analytics team, and will directly interact with other roles such as solution architect, data architect, data wrangler, and data scientist.

+ Definition, refinement, ownership, and representation of the enterprise data lake reference architecture, including: data supply and integration architecture, tools, and platforms; analytics delivery architecture, tools, and platforms.
+ Representation and alignment of the enterprise data lake reference architecture to enterprise and local analytics architecture teams.
+ Manage and coordinate new information demand that impacts the enterprise data lake reference architecture.
+ Capture business capability requirements, functional requirements, and expected service levels from business units.
+ Establish design guidelines and data integration, performance, reliability, operating, and security designs; support business units in the creation and implementation of project use cases.
+ Maintain an understanding of data and metadata needs for business units.

Job duties/responsibilities. General: working under the direction of the IT service pod manager for enterprise analytics, translate project goals into usable data models to guide project solution development and achieve consistency of information assets across the entire application portfolio. Simply stated: leads the data modeling activity for the enterprise data lake ecosystem across multiple content types (structured, semi-structured, and unstructured data), inclusive of data management, information management, and analytics solutions. Participates in the development of the enterprise analytics solution strategy and the identification and design of IT architectures to support emerging business strategic intent (e.g., big data management and analytics).

Architecture: establishes data modeling standards and best practices. Designs and implements the enterprise data lake data models. Working closely with the IT enterprise architecture team, plus other data lake and analytics teams, translates the enterprise data lake reference architecture into an operational ecosystem. Works closely with the data architect to design and implement enterprise data lake solutions. Responsible for the overall design and build of the enterprise data lake data model domain, inclusive of data management, information management, and analytics solutions.

Data management technologies: responsibilities also include the creation or use of enterprise data management processes, models, and technologies; data interface designs; and the development of internal and external checks and controls to ensure proper governance and quality of data assets, inclusive of enterprise methods and standards. As needed, lead or participate in POC, investigative, and research projects.

Roadmap: participate in data strategy and roadmap exercises, data architecture definition, and business intelligence / data warehouse product selection, design, and implementation.
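As a rough illustration of what a single record in such a data and metadata catalogue might hold (the fields and values below are my assumptions for illustration, not Entergy's design):

```python
# Hypothetical minimal metadata-catalogue record for one data-lake dataset.
from dataclasses import dataclass, field

@dataclass
class DatasetRecord:
    name: str
    zone: str            # e.g. raw / curated / analytics
    steward: str         # owning team, for stewardship and access requests
    classification: str  # e.g. public / internal / restricted
    schema: dict = field(default_factory=dict)  # column name -> type

meter_reads = DatasetRecord(
    name="meter_reads",
    zone="raw",
    steward="enterprise-analytics",
    classification="internal",
    schema={"meter_id": "string", "read_ts": "timestamp", "kwh": "double"},
)
print(meter_reads)
```

In a real ecosystem these records would live in a catalogue tool rather than code, but the same fields (ownership, classification, schema, zone) drive the access, usage, and stewardship design the posting describes.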
SDLC: work through all stages of a data solution life cycle: analyze and profile data; create conceptual, logical, and physical data model designs; architect and design ETL, reporting, and analytics solutions.

Matrix collaboration: works collaboratively with all IT and business teams, including the enterprise analytics team, the enterprise architecture team, and other business analytics teams. Leads the oversight of other data modelers as needed. Works closely with the data architect to design and implement enterprise data lake solutions. Works closely with the database management/admin (DBA) team to design and implement enterprise data lake solutions. Works closely with the data governance team to help manage and support enterprise data lake data management solutions. Works closely with all teams to support and maintain metadata libraries and catalogs.

Consulting: provide primary advisory and consulting services for technical aspects of enterprise analytics solutions and applications (Entergy's subject-matter expert on the enterprise data lake technology stack and associated data architectures). Be a thought leader at Entergy for solving data quality and availability issues, and work with the data scientists and citizen data scientists throughout Entergy's various business units. Participates in the development of the enterprise analytics solution strategy.

Minimum requirements. Minimum education required of the position: bachelor's degree in a related field such as business, engineering, or IT (or equivalent work experience); MBA or graduate degree in IT, engineering, or a relevant discipline preferred. Minimum experience required of the position: minimum of 8+ years of information technology experience required; experience with large-scale or enterprise projects, including data architecture and/or analytics leadership experience (e.g., data architect, analytics architect, or solution architect level experience); experience with big data, electric utility, and customer systems preferred.

Minimum knowledge, skills, and abilities required of the position:
+ Strong interpersonal, collaboration, leadership, and analytical skills.
+ Demonstrated strong work ethic and exceptional levels of accountability, self-drive, and business judgment.
+ Excellent oral and written communication skills.
+ Strong presentation development and delivery skills.
+ Experience with information technology implementation and/or systems development and implementation life cycles, from both a technical (IT) and a functional/business data perspective.
+ Working knowledge of enterprise application and infrastructure platforms, development methodologies, and industry best practices.
+ In-depth experience designing and implementing information solutions; previous smart grid or big data experience a plus.
+ Knowledgeable in the design and construction of information architectures, especially collaborative and analytical systems.
+ Data/information modeling experience at the enterprise level.
+ Understanding of common information architecture frameworks (such as Zachman and TOGAF).
+ Understanding of the differences between conceptual, logical, and physical data modeling.
+ Understanding of taxonomies and ontologies, as well as the methods of managing structured, semi-structured, and unstructured data.
+ Ability to effectively adapt to rapidly changing technology and apply it to business needs.
+ Ability to establish and maintain a high level of customer trust and confidence.
+ Strong analytical and conceptual skills; ability to create original concepts and theories for a variety of stakeholders.
+ Ability to analyze project, program, and portfolio needs, as well as determine the resources
needed to meet objectives and solve problems that involve remote and elusive symptoms, often spanning multiple environments in a business area.
+ Working knowledge of project planning and management, including scope, time, and resource management.
+ Experience creating information policy to support effective design of information systems and use of information across the enterprise.
+ Experience wi…

Are you looking for a patient-focused company that will inspire you and support your career? If so, be empowered to take charge of your future at Takeda. Join us as a technology manager, data science / Amazon Web Services, in our Cambridge, MA office. Here, everyone matters, and you will be a vital contributor to our inspiring, bold mission. As a technology manager, you will be empowered to provide leadership and expertise, and a typical day will include the following.

Position objectives: provides leadership and expertise in managing the Amazon Web Services data lake environment, including data ingestion, staging, quality monitoring, and modeling. Establishes the analytic environments required for structured, semi-structured, and unstructured data, including a cloud-based infrastructure with data processing and visualization capabilities. Develops, documents, and implements best practices for big data solutions.

Accountabilities: maintains oversight of platforms, technology stack, and operations for the AWS data lake. Manages projects and customer relationships of critical importance. Integrating big data analytics with enterprise systems, DW, and data marts is a strong plus. Drives rapid prototyping and designs for projects and analytic R&D environments. Maintains current knowledge of big data and IoT developments, opportunities, and challenges. Develops and maintains the technology and platform roadmap. Develops advanced, leading-edge technologies and/or concepts. Strategically supports key business objectives. Develops innovative solutions, systems, and products to support objectives across multiple business functions. Builds external alliances to gain and share information and industry trends.

Education, behavioural competencies, and skills: BS in computer science, engineering, statistics, applied math, or equivalent. 10 years of relevant work experience, including 3 years of experience designing and developing cloud-based solutions (preferably on AWS). Proficiency in using R, MATLAB, Python, or other statistical modeling packages, with a focus on machine learning; experience with NLP a plus. Proficiency in developing production systems for processing large volumes of structured and unstructured data; previous experience with the Hadoop technology stack (MapReduce, Hive, Pig, etc.). Solid understanding of ETL architectures, database technologies, performance optimization, and building communication channels between structured and unstructured databases. Track record of success required, including effectively leading and managing diverse business functions and multiple projects with a variety of stakeholders. Excellent oral and written communication skills. Excellent analytical and decision-making skills.

Travel requirements: willingness to travel to various meetings and corporate partners, including additional Takeda sites; requires approximately 10% travel.
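To make the data-lake staging idea concrete, here is a minimal sketch of one ingestion step with boto3; the bucket, prefix layout, and file name are invented for illustration and are not Takeda's environment:

```python
# Hypothetical data-lake staging step: upload a raw extract to S3 under a
# dated prefix so downstream jobs can pick it up incrementally.
import datetime
import boto3

s3 = boto3.client("s3")
today = datetime.date.today().isoformat()

s3.upload_file(
    "extract.csv",                         # local raw extract (placeholder)
    "example-data-lake",                   # assumed bucket name
    f"staging/sales/{today}/extract.csv",  # zone/domain/date prefix layout
)
```

Partitioning the staging zone by date like this is a common convention that keeps quality monitoring and reprocessing scoped to one day's drop, though the real layout would follow the platform's own standards.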
What Takeda can offer you: 401(k) with company match and annual retirement contribution plan; tuition reimbursement; company match of charitable contributions; health and wellness programs, including onsite flu shots and health screenings; generous time off for vacation and the option to purchase additional vacation days; community outreach programs. Empowering our people to shine: learn more at takedajobs com. Takeda is an EEO employer of minorities, women, disabled, and protected veterans, and considers qualified applicants with criminal histories in accordance with applicable laws. For more information, visit http: www takeda us careers eeo_policy_statement aspx. No phone calls or recruiters, please. #LI-JD1. Job: research and development. Title: big data platforms manager. Location: MA - Cambridge. Requisition ID: 1701060.

QuantumBlack helps companies use data to drive decisions. We combine business experience, expertise in large-scale data analysis and visualization, and advanced software engineering know-how to deliver results. From aerospace to finance to Formula One, we help companies prototype, develop, and deploy bespoke data science and data visualisation solutions to make better decisions. As a principal data engineer, you're passionate about data and the opportunity it provides to organisations. You get big data and cloud computing for more advanced data processing and analytics, and you are excited about these technologies. You are equally comfortable talking to senior client stakeholders to understand their data as you are designing the ingestion process to store the data locally and preparing it for data analytics. You have experience leading client projects and handling vast amounts of data: working on database design and development, data integration and ingestion, and designing ETL architectures using a variety of ETL tools and techniques. You are someone with a drive to implement the best possible solutions for clients, working closely with a highly skilled data science team.

What you'll do: lead on projects from a data engineering perspective, working with our clients to model their data landscape, obtain data extracts, and define secure data exchange approaches. Plan and execute secure, good-practice data integration strategies and approaches. Acquire, ingest, and process data from multiple sources and systems into big data platforms. Create and manage data environments in the cloud. Collaborate with our data scientists to map data fields to hypotheses, and curate, wrangle, and prepare data for use in their advanced analytical models. Have a strong understanding of information security principles to ensure compliant handling and management of client data. This is a fantastic opportunity to be involved in end-to-end data management for cutting-edge advanced analytics and data science.

Requirements: commercial experience leading on client-facing projects, including working in close-knit teams. Experience and interest in big data technologies (Hadoop, Spark, NoSQL DBs). Experience working on projects in the cloud, ideally AWS or Azure. A proven ability to clearly communicate complex solutions. Experience working on lively projects in a consulting setting, often working on different and multiple projects at the same time. Strong development background, with experience in at least two scripting, object-oriented, or functional programming languages (e.g., SQL, Python, Java, Scala, C#, R). Data warehousing experience: building operational ETL data pipelines across a number of sources and constructing relational and dimensional data models (see the sketch after this list). Experience with at least one ETL tool (e.g., Informatica, Talend, Pentaho, DataStage). The ability to work across structured, semi-structured, and unstructured data, extracting information and identifying linkages across disparate data sets. Excellent interpersonal skills when interacting with clients in a clear, timely, and professional manner. A deep personal motivation to always produce outstanding work for your clients and colleagues. Excellence in team collaboration and working with others from diverse skill sets and backgrounds.
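Here is a minimal sketch of the kind of operational ETL and dimensional modeling the requirements describe, assuming a hypothetical orders.csv extract; it conforms a small date dimension and loads a tiny star schema into SQLite:

```python
# Toy ETL sketch: extract a CSV, build a date dimension, load a star schema.
# File, table, and column names are invented for illustration.
import sqlite3
import pandas as pd

orders = pd.read_csv("orders.csv", parse_dates=["order_date"])

# Dimension: one row per calendar date seen in the extract.
dim_date = (orders[["order_date"]].drop_duplicates().assign(
    date_key=lambda d: d.order_date.dt.strftime("%Y%m%d").astype(int),
    year=lambda d: d.order_date.dt.year,
    month=lambda d: d.order_date.dt.month,
))

# Fact table keyed by the surrogate date_key.
fact = orders.assign(
    date_key=orders.order_date.dt.strftime("%Y%m%d").astype(int)
)[["date_key", "customer_id", "amount"]]

with sqlite3.connect("warehouse.db") as conn:
    dim_date.drop(columns="order_date").to_sql(
        "dim_date", conn, if_exists="replace", index=False)
    fact.to_sql("fact_orders", conn, if_exists="replace", index=False)
```

The same extract / conform / load shape carries over to the ETL tools the posting lists; only the storage and orchestration layers change.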
Job description: translate complex functional and technical requirements into detailed designs. Proficiency in Spark for technical development and implementation. Ability to utilize Hadoop, Spark, and Cassandra. Experience in deep learning: TensorFlow, Keras, Theano, and Python. Contribute and adhere to best engineering practices for source control, release management, deployment, etc. Participate in and facilitate production support, job scheduling, and monitoring.

Qualifications: BS and MS in mathematics, computer science, or engineering. 5+ years of Python, Java, or Scala; 3+ years of demonstrated technical proficiency with Spark and big data projects, and experience in data modeling. Experience implementing machine learning models using TensorFlow, Keras, Theano, or Spark MLlib; writing high-performance, reliable code and Spark scripts; and implementing Kafka and Flume topics and processes. Good knowledge of database structures, theories, principles, and practices. Analytical and problem-solving skills applied to the big data domain. Understanding and experience of utilizing Hadoop, Pig, Hive, Spark, Elasticsearch, etc. Good aptitude for multi-threading and concurrency concepts. Job type: contract. - provided by Dice
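For orientation, here is a minimal Keras binary classifier of the sort such postings allude to; the toy data and layer sizes are my own placeholders, not the client's models:

```python
# Minimal Keras sketch: a small binary classifier on synthetic data.
import numpy as np
from tensorflow import keras

X = np.random.rand(256, 20).astype("float32")  # toy feature matrix
y = (X.sum(axis=1) > 10.0).astype("float32")   # toy binary labels

model = keras.Sequential([
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=3, batch_size=32, verbose=0)

loss, acc = model.evaluate(X, y, verbose=0)
print(f"training-set accuracy: {acc:.2f}")
```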
Big data architect - Chicago, IL - $165k. The ideal candidate has a strong entrepreneurial spirit and the ability to teach and mentor their peers. This individual has the ability to advise clients and to work as a key member of the delivery team. The role will require deep hands-on technical skills in designing, developing, and implementing large-scale big data solutions. If interested, please read the description below.

Requirements: engage throughout the sales cycle to demonstrate organizational and personal expertise in a specific line of service area. Build new thought leadership and points of view on key topics that align with client needs. Support new client services and competency areas. Lead the consultative development of tailored solutions to business problems that leverage thought-leadership capabilities and our understanding of the client situation. 5+ years of experience in data warehousing, operational data stores, and large-scale architecture and implementation. Experience in solutions crafting and in developing proposals to illustrate the business value provided by technical solutions. Excellent communication and presentation skills. Previous professional consulting experience. Expertise in big data technologies in the Hadoop ecosystem: Hive, HDFS, MapReduce, YARN, Kafka, Pig, Oozie, HBase, Sqoop, Spark, etc. Hands-on experience with Scala, Python, and R.

Benefits: excellent health, dental, and vision insurance; revenue sharing and a 401(k) retirement savings plan; life, disability, and long-term care insurance; little to no travel; robust career development and training.

Contact: Myrka Veloz at 212-731-8282 (ext 3352). Apply immediately for consideration; interviews are currently taking place! Please send your resume to m veloz@frgconsulting com. FRG Consulting is a leader in niche IT recruitment, with a focus on cloud technologies, including Azure and AWS, as well as big data, data science, and BI. We deal with Microsoft partners, AWS partners, and end users throughout North America. We have open positions and relationships with some of the top partners and end users throughout the US, and offer some excellent opportunities in the BI / big data space. I understand the need for discretion and would welcome the opportunity to speak to any big data and cloud analytics candidates who are considering a new career or job, either now or in the future; confidentiality is of the utmost importance. For more information on available BI jobs, as well as the business intelligence, big data, and cloud market, I can be contacted at 1-212-731-8282. - provided by Dice

Big data developer, global data & content. Who we are: founded and continuously led by inventor and entrepreneur Tony Aquila, Solera is a global leader in digital technologies that connect and secure life's most important assets: our cars, homes, and identities. Since its inception in 2005 as a garage-based startup, Solera has grown aggressively, with over 50 acquisitions across its platforms. The company's current product solutions include Audatex, Autodata, AutoPoint, CAP HPI, Colimbra, Digidentity, Enservio, Explore Data, Hollander, Identifix, Inpart, LYNX, and Titletec, as well as the company's flagship Digital Garage application. Today, Solera processes over 300 million transactions annually for approximately 235,000 partners and customers in over 80 countries. Unified by a strong culture that values uncommon entrepreneurial thinking and continuous "do-it-different" innovation, Solera's global workforce of 6,700+ associates comes from diverse, forward-thinking industries that include automotive technology, artificial intelligence, software development, data sciences, cybersecurity, cognitive design, and digital identity protection. For more information, please visit solera com.

This position (Dallas, TX) is with the Solera global data & content team, headquartered in Dallas, TX and Madrid, Spain. Our team is responsible for making sense of data and providing insights to various business groups within the company. We use foundational open source technologies to move Solera forward. For more information, go to soleragdc com or http: www solera com careers.

What you'll be doing. Data engineering: create data pipelines (batch and streaming, using Spark, Kafka, and Concourse) to bring automotive collision, service, and marketing data from 50+ data sources in 80+ countries. Develop a re-usable, modular library for data ingestion, transformation, and delivery. Own and refine bounded data contexts, including data models in Avro and integration with the wider data environment, based on the business context and needs.
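A compact, hypothetical sketch of such a streaming pipeline with Spark Structured Streaming reading from Kafka; the broker, topic, schema, and paths are invented, and the job additionally needs the spark-sql-kafka connector package on its classpath:

```python
# Hypothetical streaming ingest: Kafka topic -> parsed records -> Parquet.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("collision-stream").getOrCreate()

schema = StructType([               # assumed event shape
    StructField("claim_id", StringType()),
    StructField("country", StringType()),
    StructField("amount", DoubleType()),
])

events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "collision-events")
          .load()
          .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
          .select("e.*"))

query = (events.writeStream
         .format("parquet")
         .option("path", "/data/collision")
         .option("checkpointLocation", "/chk/collision")
         .start())
query.awaitTermination()
```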
Are you qualified? Must have: proficiency in a Linux/Unix environment. The position requires at least 5 years of experience in software development and 2 years of experience with the Hadoop ecosystem. Experience with big data technologies such as the Hadoop stack, Kafka, and Spark. Deep knowledge of Java or Scala. Extensive knowledge of design patterns and SOLID principles. Strong data modelling skills. Experience with relational databases and knowledge of NoSQL. Continuous integration/delivery/deployment tools and methodologies: Maven, Git, Nexus, Bamboo, Jenkins, Sonar. Strong grasp of agile development concepts and experience working in an agile/Scrum environment.

Highly desirable: experience with stream processing; experience with the Hortonworks distribution; experience with Solr or Elasticsearch; degree in computer science or another numerical discipline.

Benefits: relocation available for qualified candidates; highly competitive pay and health and wellness plans; 401k; tuition reimbursement; a no-BS policy that promotes transparency and accountability; beautiful and uncommon workspaces to collaborate and unwind; free gym membership (to the awesome gym that's right next to our office); free meals, healthy snacks (like nuts and yogurt parfaits), some indulgent snacks (like baked chips and dark chocolates), and refrigerators full of juices, teas, and other life-essential beverages (including Red Bull); the latest and greatest in all things technology; lots and lots of awesome cars.

The Solera way: Solera's uncommon culture is based on three simple principles: think 80/20 (focus), act 30/30 (efficiency), and live 90/10 (accountability). We define our mindset using our 3 H's: humility, a hunger to succeed, and a desire to hunt for opportunities to win. We train our volunteers to engage with each other, modulating between their intellect (IQ) and emotional intelligence (EQ), using our 3 F's: facts, finesse, and force. Solera has become a global technology leader that is constantly growing in the double digits. The principles, drills, and values associated with the Solera way have been fundamental to Solera's success and our ability to grow, continuously change, and innovate. Are you uncommon?
We're on the hunt for an experienced big data developer who ranks in the top quartile among their peers: someone who has a highly competitive and entrepreneurial mindset, is wired with a team-first attitude, has no problem rolling up their sleeves to execute their missions, and can modulate between leading and following as needed. You will serve as the big data developer; the role is based at Solera's offices in Westlake (Dallas / Fort Worth area), within a team that develops global solutions.

Company description: IQVIA™ is the Human Data Science Company™, focused on using data and science to help healthcare clients find better solutions for their patients. Formed through the merger of IMS Health and Quintiles, IQVIA offers a broad range of solutions that harness advances in healthcare information, technology, analytics, and human ingenuity to drive healthcare forward. Our predictive analytics team within the Real World Insights (RWI) technology division is a fast-growing group of collaborative, enthusiastic, and entrepreneurial individuals. In our never-ending quest for opportunities to harness the value of real world evidence (RWE), we are at the centre of IQVIA's advances in areas such as machine learning and cutting-edge statistical approaches. Our efforts improve retrospective clinical studies, under-diagnosis of rare diseases, personalized treatment response profiles, disease progression predictions, and clinical decision-support tools.

Job description: we are looking for a software engineer with expertise in Apache Spark to join our software team. The candidate will implement in Scala to process data for use in a commercial data warehouse based on AWS plus other partner solutions. The candidate should have the good design sense that helps very large datasets be consumable by a range of disciplines, and will contribute to components of our data warehouse platform: a range of data-warehouse-related software such as data modeling, setting up relational and dimensional databases, transforming data using Scala, and writing advanced SQL-based scripts and tools. Communication across disciplines will be an important aspect, as we have teams in multiple geographies; the importance of oral communication and of being a positive team player is all the greater.

Qualifications. Required: Apache Spark expertise, especially in Scala; expert-level knowledge of SQL, data modeling, and database architecture; hands-on experience with Java; hands-on experience with basic scripting languages, including shell script; working experience with Databricks or similar; knowledge of concepts such as data warehouses, star schemas, and KPIs (see the sketch below); hands-on experience with Linux, HDFS, Hive, and Hadoop. Preferred: excellent written and communication skills; able to organize ideas and technical details, and to present work clearly through the product lifecycle; ability to work in an agile environment; experience designing and implementing data layers and middle tiers aimed at business intelligence applications; degree in computer science or a related subject; experience with columnar data sources such as Redshift, Parquet, etc.; hands-on experience with AWS.
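As a brief illustration of the star-schema / KPI idea, a Spark SQL aggregation joining a hypothetical fact table to a date dimension; the tables are assumed to be already registered in the metastore, and the names (matching the earlier SQLite sketch) are equally invented:

```python
# Hypothetical KPI query over a star schema: monthly revenue from a fact
# table joined to its date dimension.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("kpi-demo").getOrCreate()

monthly_revenue = spark.sql("""
    SELECT d.year, d.month, SUM(f.amount) AS revenue
    FROM fact_orders f
    JOIN dim_date d ON f.date_key = d.date_key
    GROUP BY d.year, d.month
    ORDER BY d.year, d.month
""")
monthly_revenue.show()
```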
DISH is a Fortune 200 company with more than $15 billion in annual revenue that continues to redefine the communications industry. Our legacy is innovation and a willingness to challenge the status quo, including reinventing ourselves. We disrupted the pay-TV industry in the mid-90s with the launch of the DISH satellite TV service, taking on some of the largest U.S. corporations in the process, and grew to be the fourth-largest pay-TV provider. We are doing it again with the first live internet-delivered TV service, Sling TV, which bucks traditional pay-TV norms and gives consumers a truly new way to access and watch television. Now we have our sights set on upending the wireless industry and unseating the entrenched incumbent carriers. We are driven by curiosity, pride, adventure, and a desire to win; it's in our DNA. We're looking for people with boundless energy, intelligence, and an overwhelming need to achieve to join our team as we embark on the next chapter of our story. Opportunity is here. We are DISH.

The big data development team processes over a billion incoming data records each day. We use a Cloudera Hadoop cluster with 1,000 CPUs and 1.5 petabytes of data capacity to provide cutting-edge solutions to virtually every line of business at DISH. We are looking for a Java/Hadoop architect-developer to join our team here in Englewood, CO and help lead us to best-in-class performance. Primary responsibilities fall into the following categories:
- Ability to translate high-level business requirements into detailed design
- Aptitude to identify, create, and use best practices and reusable elements
- Ability to solve practical problems and deal with a variety of concrete variables in situations where only limited standardization exists
- Strong desire to learn a variety of technologies and processes with a "can do" attitude
- Experience guiding and mentoring 2-5 developers on various tasks
- Ability to own a complete functional area, from analysis to design to development and complete support

A successful Hadoop developer/architect will have the following: a 4-year college degree or equivalent experience (Bachelor of Science preferred) and 6+ years of professional development experience, or an equivalent combination of education and work experience. Java/J2EE object-oriented, pattern-based development experience is required. Experience with Cloudera Hadoop, Hive, Impala, Kafka, Informatica, ETL data prep tools, time-series data processing, and data science techniques is a definite plus (a small sketch of this kind of time-series processing follows), as is experience with XML, JSON, SQL, Unix, Eclipse, IP network protocols, and delivering projects in an Agile environment using Scrum/XP methodologies. Candidates will have worked in a fast-paced environment with a focus on test-driven development and CI/CD, and will have source code management experience (Subversion, PVCS, Maven, etc.), preferably with web services development experience and PL/SQL development (Oracle, SQL Server).
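To make the "time-series data processing" mention concrete, here is a minimal, hedged sketch: counting events per device in event-time windows with Spark. The table and column names (device_events, device_id, event_ts) are invented for illustration and are not DISH's schema.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object DeviceEventRates {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("device-event-rates").getOrCreate()

    // Hypothetical Hive table of raw, timestamped device events.
    val events = spark.table("device_events")

    // Count events per device in tumbling 15-minute event-time windows.
    val rates = events
      .groupBy(col("device_id"), window(col("event_ts"), "15 minutes"))
      .agg(count(lit(1)).as("event_count"))

    // Persist for downstream Hive/Impala queries.
    rates.write.mode("overwrite").saveAsTable("device_event_rates")
    spark.stop()
  }
}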
Sr. DB Developer (perm): will ideally be leading a small but growing team in Minneapolis. Ideal background: strong C# experience is not required, but in most cases a senior-level person should be able to discuss OO programming. We want someone very fluent on the database side, with strong communication skills who can go in front of a client. This person should have well-rounded experience across all of the areas of data flowing through the company: writing queries for reports, ETL, data warehouse, etc. The majority of our apps are third-party, which means gaps in functionality in what we can and can't do. This person will handle the governing of all external-facing data; they will help structure what the data marts need to look like and work with the data science team to gain a strong idea of where the data needs to go. Lots of growth possibility; the team is newer and growing.

Description: Leidos is seeking a Big Data Architect to join our corporate Computer Information Systems group. This position can be supported as 100% telecommute. In this role you will work closely with the Director of Cyber Transformation to shape and implement the architecture for a cyber streaming data analysis platform. You will bring your experience running ELK and Hadoop cluster architectures to build a resilient, flexible, and stable system. You will get to work together with best-in-industry engineers and data scientists to turn data into knowledge. You will also lead the team's day-to-day working activities and be responsible for deliveries of key milestones.

Primary responsibilities:
- Shape and implement big data architectures
- Build high-efficiency, real-time data pipelines for ingest and processing (see the sketch after this posting)
- Ensure the architecture supports all functional and non-functional requirements
- Create briefings for the team and for executive leadership
- Serve as a valuable technical resource for lines of business

Required qualifications:
- BS degree in computer science, mathematics, or a related STEM major, plus 8 years of relevant experience in software architecture, preferably in data-centric systems
- Curiosity: you ask why, you explore, you are familiar with the latest and greatest open source tools, and you are always fascinated by what's possible and what could be better
- Ability to build and maintain resilient, interface-oriented architectures, with demonstrated experience defining and implementing such architectures
- Experience developing architectures with multiple big data technologies (such as the Elastic (ELK) stack, Hadoop, HBase, Mongo, NiFi, and Kafka), with a solid understanding of the tradeoffs of each and of when each is appropriate to use
- Experience with REST APIs, Java, and Python (or another similar interpreted language)
- A firm understanding of leading modern development approaches (serverless, containerization, cloud, continuous delivery, microservices, event-based applications)
- Experience working in a Linux environment such as Red Hat, CentOS, or Ubuntu

Preferred qualifications:
- Ability to design, develop, and maintain applications within the cloud environment
- Familiarity with machine learning and artificial intelligence approaches
- Familiarity with data analysis or statistics and some data wrangling (programmatically extracting databases, transforming data, modeling it in readable form, etc.)
- Experience with graph databases a plus
- Spark experience
- Hands-on experience running, configuring, and balancing Elasticsearch clusters (full lambda-architecture cluster experience a plus)
- Strong oral and written communication and interpersonal skills

Leidos overview: Leidos is a global science and technology solutions leader working to solve the world's toughest challenges in the defense, intelligence, homeland security, civil, and health markets. The company's 33,000 employees support vital missions for government and commercial customers. Headquartered in Reston, Virginia, Leidos reported pro forma annual revenues of approximately $10 billion for the fiscal year ended January 1, 2016, after giving effect to the recently completed combination of Leidos with Lockheed Martin's Information Systems & Global Solutions business (IS&GS). For more information, visit www.leidos.com.
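For the "real-time data pipelines for ingest and processing" responsibility above, here is a minimal, hedged sketch using Spark Structured Streaming to consume a Kafka topic and land parsed records for downstream analysis. The broker address, topic, schema, and output paths are hypothetical placeholders; an actual pipeline on this posting's stack (for example, NiFi into Elasticsearch) would differ.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._
import org.apache.spark.sql.types._

object CyberIngest {
  def main(args: Array[String]): Unit = {
    // Requires the spark-sql-kafka connector on the classpath.
    val spark = SparkSession.builder().appName("cyber-ingest").getOrCreate()

    // Expected shape of each JSON event on the topic (hypothetical).
    val schema = new StructType()
      .add("host", StringType)
      .add("event_type", StringType)
      .add("ts", TimestampType)

    // Read the raw byte stream from Kafka.
    val raw = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")
      .option("subscribe", "cyber-events")
      .load()

    // Parse the Kafka value payload into typed columns.
    val parsed = raw
      .select(from_json(col("value").cast("string"), schema).as("e"))
      .select("e.*")

    // Append to Parquet; the checkpoint makes the query restartable.
    val query = parsed.writeStream
      .format("parquet")
      .option("path", "/data/cyber/events")
      .option("checkpointLocation", "/data/cyber/_checkpoints/events")
      .start()

    query.awaitTermination()
  }
}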
The company's diverse employees support vital missions for government and commercial customers. Qualified women, minorities, individuals with disabilities, and protected veterans are encouraged to apply. Leidos will consider qualified applicants with criminal histories for employment in accordance with relevant laws. Leidos is an equal opportunity employer.

Business title: Big Data Architect. Requisition number: 32202-97. Function: Business Support Services. State: WA. City: Seattle.

Description: Known for being a great place to work and build a career, KPMG provides audit, tax, and advisory services for organizations in today's most important industries. Our growth is driven by delivering real results for our clients. It is also enabled by our culture, which encourages individual development, embraces an inclusive environment, rewards innovative excellence, and supports our communities. With qualities like those, it's no wonder we're consistently ranked among the best companies to work for by Fortune Magazine, Consulting Magazine, Working Mother Magazine, Diversity Inc, and others. If you're as passionate about your future as we are, join our team. KPMG is currently seeking a Big Data Architect to join our KPMG Lighthouse - Center of Excellence for Advanced Analytics.

Responsibilities:
+ Rapidly architect, design, prototype, and implement architectures to tackle the big data and data science needs of a variety of Fortune 1000 corporations and other major organizations
+ Work in cross-disciplinary teams with KPMG industry experts to understand client needs and ingest rich data sources such as social media, news, internal and external documents, emails, financial data, and operational data
+ Research, experiment with, and utilize leading big data methodologies such as Hadoop, Spark, Redshift, Netezza, SAP HANA, and Microsoft Azure
+ Architect, implement, and test data processing pipelines and data mining/data science algorithms in a variety of hosted settings, such as AWS, Azure, client technology stacks, and KPMG's own clusters
+ Translate advanced business analytics problems into technical approaches that yield actionable recommendations across multiple diverse domains; communicate results and educate others through the design and build of insightful visualizations, reports, and presentations
+ Develop skills in business requirement capture and translation, hypothesis-driven consulting, work stream and project management, and client relationship development

Qualifications:
+ Bachelor's degree from an accredited college/university in computer science, computer engineering, or a related field and a minimum of seven years of big data experience with multiple programming languages and technologies; or master's degree and a minimum of 5 years of experience; or PhD in computer science, computer engineering, or a related field with a minimum of three years of big data experience
+ Fluency in several programming languages such as Python, Scala, or Java, with the ability to pick up new languages and technologies quickly; understanding of cloud and distributed systems principles, including load balancing, networks, scaling, in-memory vs. disk, etc.; and experience with large-scale big data methods such as MapReduce, Hadoop, Spark, Hive, Impala, or Storm
+ Ability to work efficiently in a Unix/Linux environment or .NET, with experience with source code management systems like Git and SVN
+ Ability to work with team members and clients to assess needs, provide assistance, and resolve problems, using excellent problem-solving skills, verbal and written communication, and the ability to explain technical concepts to business people
+ Ability to travel up to 80%

KPMG LLP (the U.S. member firm of KPMG International) offers a comprehensive compensation and benefits package. KPMG is an equal opportunity employer. All qualified applicants are considered for employment without regard to race, color, creed, religion, age, sex, gender, national origin, ancestry, citizenship status, marital status, sexual orientation, gender identity or expression, disability, physical or mental handicap unrelated to ability, pregnancy, veteran status, unfavorable discharge from military service, genetic information, personal appearance, family responsibility, matriculation, or political affiliation, or any other legally protected status. KPMG maintains a drug-free workplace. KPMG will consider for employment qualified applicants with criminal histories in a manner consistent with the requirements of applicable local, state, or federal law (including San Francisco Ordinance Number 131192). No phone calls or agencies, please.

Description: Multiple software engineering and data science positions at Experian. Experian's Oxygen software framework links behavior data at massive scale to enable consumers across the globe to access financial, healthcare, and other services. Oxygen's high-throughput pipelines ingest billions of data events per hour to generate behavioral states and metrics in real time. We are looking for software engineers and data scientists who will bring fresh perspectives on high-throughput data ingestion, stochastic event linking, software-defined state calculation, algorithm design; the list goes on. As a software engineer or data scientist in a small and highly talented engineering team, you will have the opportunity to create products from the ground up and positively impact the lives of people across the globe.

Knowledge, experience & qualifications: We mostly use Python and Scala to develop for the Spark and Hadoop stacks, and we are looking for bachelor's, master's, and PhD graduates with experience in one or more of these. Experian offers top-of-the-line benefits, including health, vision, and dental insurance, flexible spending and health savings accounts, life insurance, 401k, a stock purchase plan, bonus, maternity and paternity leave, etc. Experian is an equal opportunity employer.

Interview mode: in-person.
Job description: Create and maintain conceptual, logical, and physical data models for the OneMD Data & Analytics product line. Work with business and technical teams to fully understand the business requirements; analyze data and translate requirements into data structures. Coordinate, develop, and construct data models and data dictionaries ranging from functional-specific to enterprise-wide, and from operational to decision-support. Work with the database administrators to establish naming conventions and optimization standards for database objects, and partner with database administration staff in the handoff between logical and physical database design. The candidate must have strong modeling skills to support data science tools and help enable self-service capabilities.

Essential job functions:
- Analyze and design a future-state environment to address the OneMD Data & Analytics product line serving multiple cross-functional business organizations
- Lead the design of big data solutions
- Create and manage logical, physical, and conceptual models using data modeling tools
- Develop and communicate standards for data structures; build referential integrity into the models (a minimal sketch follows this posting)
- Investigate data quality issues and provide recommendations and solutions to address them
- Work with offshore development and testing teams to ensure successful implementation of designs
- Profile existing data structures to thoroughly understand all data assets that exist in the warehouse and the sources of expansion areas
- Lead and manage department projects and initiatives; perform other duties as assigned

Qualifications, minimum education and experience: bachelor's degree in computer science, information science, or a related field, or equivalent work experience; seven to ten years of experience with data modeling in healthcare; expert in data modeling tools (Erwin); significant business focus and experience required; provide project-level support to development teams through consulting and data modeling services.

Preferred education and additional qualifications: experience with a layered data architecture approach; experience with data architecture and design patterns; experience with healthcare big data; experience with big data technologies such as AWS, Hadoop, Python, Spark, and Scala; data warehouse and reporting database design experience highly desirable; formal development methodology and data modeling training preferred.

Required knowledge, skills, and abilities: excellent customer service, interpersonal communication, and team collaboration skills are essential. Must be a highly motivated self-starter. Strong project planning, time management, problem solving, and prioritization skills are also necessary, and the candidate must be able to work on multiple simultaneous tasks with limited supervision. Also required: knowledge of how technical platforms and environments are constructed; ability to reverse engineer existing database structures; knowledge of connectivity between environments; a strong understanding of how applications run on various platforms; strong business acumen and political savvy; ability to lead and manage projects; ability to collaborate while dealing with complex situations; ability to think creatively and drive innovation; ability to motivate, lead, and inspire a diverse group toward a common-goal solution with multiple stakeholders; ability to convert business strategy into action-oriented objectives and measurable results; ability to investigate data-related issues and recommend and implement a data quality framework; and strong negotiating, influencing, and consensus-building skills. Contact: Rajeev Selvaraj, (732) 619-8646, rajeev@eitprofessionals.com
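To make the naming-convention and referential-integrity duties concrete, here is a hedged sketch that issues generic ANSI DDL over JDBC: a dimension table and a fact table whose foreign key enforces referential integrity, following a dim_/fact_ prefix convention. The connection URL, table names, and columns are illustrative only, and the appropriate JDBC driver is assumed to be on the classpath.

import java.sql.DriverManager

object StarSchemaDdl {
  def main(args: Array[String]): Unit = {
    // Hypothetical JDBC endpoint; any ANSI-compliant RDBMS would do.
    val conn = DriverManager.getConnection(
      "jdbc:postgresql://localhost/warehouse", "etl", "secret")
    val st = conn.createStatement()

    // Dimension first, so the fact table's foreign key has a target.
    st.executeUpdate(
      """CREATE TABLE dim_product (
        |  product_key  INTEGER PRIMARY KEY,   -- surrogate key
        |  product_code VARCHAR(32) NOT NULL,  -- natural/business key
        |  product_name VARCHAR(128) NOT NULL
        |)""".stripMargin)

    // Fact table at the grain of one row per product per day; the
    // REFERENCES clause is what gives the model referential integrity.
    st.executeUpdate(
      """CREATE TABLE fact_sales_daily (
        |  date_key    INTEGER NOT NULL,
        |  product_key INTEGER NOT NULL REFERENCES dim_product (product_key),
        |  units_sold  INTEGER NOT NULL,
        |  PRIMARY KEY (date_key, product_key)
        |)""".stripMargin)

    st.close(); conn.close()
  }
}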
Who is AdvisorEngine: We believe that the future of financial advice is personal, scientific, and beautiful; these three ideals drive everything that we do. AdvisorEngine technology creates a unified experience across financial advisors, clients, and business management personnel. Our journey is just beginning, but we've already started turning our vision into reality. We have built the industry's most advanced wealth management platform using smart automation; added data integrations with other leading technology companies; developed strong client relationships; completed four acquisitions; and raised over $55 million in investor capital to help fuel our future growth. Our team is made up of designers, enterprise technologists, data scientists, futurists, and business builders. We are based in Tribeca, New York, with offices in Atlanta, Georgia and Raleigh, North Carolina. If you are driven to create the future of financial advice, we'd love to hear from you.

Job description: AdvisorEngine is looking for a contract-to-hire SQL developer with Postgres experience; only candidates interested in transitioning to a permanent position should apply. This person will work with an existing SQL developer, supporting and developing the databases for our cloud-based CRM application as well as supporting a legacy desktop application, both of which are based on SQL Server. The SQL developer will also work on an effort to move our self-hosted SQL Server to AWS, and will then evaluate and determine the viability of a conversion from SQL Server to Postgres. The person must be a quick learner and thrive in a fast-paced environment that requires strong multitasking and communication skills.

Responsibilities will include:
· Updating the production SQL Server configuration and applying updates
· Implementing the database strategy for backup and recovery within our primary environment and to our disaster recovery environment
· Performing database and SQL tuning for existing and new SQL access
· Contributing to the security plans for data at rest
· Tuning the application databases, including index and performance analysis, to provide optimal performance
· Implementing database monitoring and reporting for our production environment
· Documenting the existing data model, designing model updates for new development requirements, and maintaining model documentation
· Contributing to QA data methodologies that allow repeatable testing on SQL databases, including performance, load, and regression testing
· Reviewing the existing SQL implementation and developing an upgraded SQL model to support existing and future development
· Working with developers to provide oversight and development on the SQL code
· Writing and optimizing SQL code
· Assisting with SSIS reporting that supports migration from one application to another
· Assisting with improving the application's data analytics capabilities
· Participating in an Agile development environment, including sizing, estimating, collaborative development, and fast-paced iterative delivery
· Evaluating and implementing a move from a self-hosted to an AWS environment
· Participating in the evaluation of the viability of transitioning from SQL Server to Postgres for all or portions of our application
· Some after-hours work for deployments and support

What you have, required:
· Bachelor's degree in computer science, or equivalent experience demonstrating an advanced level of SQL and DBA knowledge and implementation history
· Demonstrated ability to multi-task and succeed in a fast-paced, dynamic environment
· Strong analytical ability to quickly debug application problems and provide short- and long-term solutions
· 3+ years of working with the .NET framework
· 5+ years of developing and maintaining SQL code for web-based applications
· 5+ years of SQL Server 2008/2012/2016 development experience, including writing complex queries, stored procedures, and performance tuning
· 3+ years with database modeling (designing tables, relationships, etc.)
· Demonstrated history dealing with SQL and database performance, scalability, and maintainability
· Experience establishing DB restoration plans and implementations to achieve RPO/RTO objectives
· Experience with clustering, log shipping, and other SQL Server practices, and with determining the appropriate technology to meet the required needs
· Experience with SQL and DB tooling to determine server reliability and performance, including extensive experience with the Profiler
· Strong written and verbal communication, teamwork, and problem-solving skills
· Ability to interact and communicate successfully with business partners and technology teams
· Experience with Agile development practices
· Self-starter who can grasp difficult concepts
· Ability to think outside the box and come up with creative solutions when tools don't work

Desired:
· Experience with AWS
· Experience with Postgres
· Experience with Hadoop
· Experience working with SSIS packages
· Experience with data warehousing
· Experience developing with C#, JavaScript, and scripting languages like PowerShell
· Experience with the financial services or brokerage industry
· Experience working with resources in different geographical locations

VDart is an IT staffing firm based out of Atlanta, GA, specializing in digital & emerging technologies. Founded in 2007, VDart has over 1,700 employees and contractors spread across 3 continents. We specialize in providing Fortune 1000 companies niche, hard-to-find skills in technologies including social, mobile, big data, analytics, cloud, machine learning, and artificial intelligence. With delivery centers in the UK, Mexico, Canada, and India, we provide talent solutions to global customers covering EMEA, APAC & the Americas. We provide deep technology and domain expertise in the BFSI, energy & utility, technology, and CPG & retail industry verticals. VDart is an award-winning organization recognized in the Inc 500 Hall of Fame; on the Atlanta Business Chronicle's Fastest Growing Companies list; as NMSDC's National Supplier of the Year; as Ernst & Young's Regional Entrepreneur of the Year; and more.

Job description: Position: Big Data Architect with Life Science domain. Location: East Hanover, NJ. Contract. Mandatory skills: big data, Spark, life science, data ingestion, Cloudera. A big data and Hadoop background and Cloudera experience are required, likely with substantial data ingestion using Spark; life science experience is a big plus. If your skills match our requirements, please send your resume to recruiter@vdartinc.com for immediate consideration. Please be assured that your resume will be reviewed, and you will be contacted if there is interest in your background and experience. Referral program: ask our recruiting team how you can be a part of our referral program; if you refer a candidate with the desired qualifications and your candidate accepts the role, you can earn a generous referral fee. We want to hire the best talent available and are committed to building great teams and partnerships. We are an equal employment opportunity employer. VDart Inc., Alpharetta, GA, recruiter@vdartinc.com. Follow us on Twitter for the hottest positions: @vdart_jobs. Follow us on Twitter: @vdartinc
Description: The Advanced Technologies Group is responsible for leveraging new and emerging open source technologies to solve key technical challenges for our clients, as well as integrating new data sets, products, and applications acquired by Experian. As a big data lead on our team, you will be responsible for applying your big data skills and deep knowledge of enterprise application development to create seamless solutions that integrate our wide array of data and tools into comprehensive, scalable solutions.

Key responsibilities:
+ Work in a collaborative manner with our Scaled Agile (SAFe) teams to rapidly deliver solutions
+ Understand technical and business requirements and translate them into technical implementations
+ Design and implement high-performance, scalable data solutions
+ Direct the work of others
+ Provide best-in-class security in everything you do

Knowledge, experience & qualifications:
+ BS degree in computer science, computer engineering, or equivalent
+ 5+ years' experience delivering enterprise software solutions in Java, Scala, or Python
+ Hands-on expertise with big data, Hadoop, and the Hadoop ecosystem, including Hive, Spark, and YARN
+ Experience with Linux and Linux scripting languages
+ Experience building solutions using continuous integration and automated testing (a small test sketch follows this posting)
+ Experience leading or mentoring others
+ Experience with Agile development methodologies
+ Solid understanding of secure application development methodologies
+ Strong desire to constantly learn new skills

Pluses:
+ Experience writing cloud-deployed applications using Amazon Web Services
+ Experience with other NoSQL technologies such as HBase, MongoDB, Cassandra, etc.
+ Experience with DevOps
+ Experience working with data scientists or other analytical users

Benefits: Working for a leading FTSE 50 global information services company is just one of the rewards of joining Experian. Our benefits package is designed to reward contribution and loyalty and to attract the kind of talented individuals who have their pick of employers. That is why we offer a highly competitive package comprising a competitive base salary, an aggressive bonus plan, and core benefits including full medical, dental, and vision coverage, a matching 401k, and the opportunity to work with a global leader. Experian is listed on the London Stock Exchange (EXPN) and is a constituent of the FTSE 100 index. Total revenue for the year ended March 31, 2016 was $4.8 billion. Experian employs approximately 17,000 people in over 40 countries. Experian is an equal opportunity employer. Anyone needing accommodation to complete the interview process should notify the talent acquisition partner. The word "Experian" is a registered trademark in the EU and other countries and is owned by Experian Ltd. and/or its associated companies.
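As a hedged illustration of the "continuous integration and automated testing" qualification, here is a minimal ScalaTest (3.x) suite for a pure helper function that a CI job could run on every commit. The function and its rule (normalizing country codes) are invented for the example; the point is keeping transformation logic pure so it is trivially testable.

import org.scalatest.funsuite.AnyFunSuite

// Hypothetical transformation helper kept pure so it is easy to test.
object Normalize {
  def countryCode(raw: String): Option[String] = {
    val c = raw.trim.toUpperCase
    if (c.matches("[A-Z]{2}")) Some(c) else None
  }
}

class NormalizeSuite extends AnyFunSuite {
  test("valid two-letter codes are trimmed and upper-cased") {
    assert(Normalize.countryCode(" us ") === Some("US"))
  }
  test("anything else is rejected rather than guessed") {
    assert(Normalize.countryCode("usa") === None)
    assert(Normalize.countryCode("") === None)
  }
}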
TISTA Science and Technology Corporation, a CMMI Maturity Level 3 company, focuses on delivering information technology and professional services to federal and state agencies. TISTA is an Inc 500 company; a recipient of the 2010 Top 100 Service-Disabled Veteran-Owned Businesses award from DiversityBusiness; recognized in Washington Technology's Fast 50 list of the fastest growing small businesses in government contracting in 2012 & 2013; recognized among the top 25 fastest growing small technology companies by the Washington Business Journal in 2014 & 2015; and selected as the Veteran-Owned Company of the Year in 2014 by the Montgomery County, MD Department of Economic Development.

The Sr. Data Engineer's mission will be to guide this platform's development so that it becomes the engine of a healthcare-wide transition to value-based care, by enabling effective data sharing, analytics, performance transparency, and value-based payment. The engineer is expected to provide hands-on software development support for a large data warehouse project hosted in a cloud environment; provide architectural guidance and oversight to projects within a team; mentor junior and mid-level engineers; and enforce high-quality coding standards and practices via reviews and by demonstrating them in their own work.
The engineer should also be able to help others break down large team goals into specific and manageable tasks; be involved in and supportive of the Agile sprint model of development, helping to enforce the practice and the discipline; work efficiently and proactively across engineering teams to enable us to deliver on our goals of loosely coupled, adaptable, scalable solutions; and have a good understanding of where their project fits into the larger goals for engineering, adapting their work so that the priorities of the systems they are creating match those of the organization.

Requirements: demonstrated expertise with one or more high-level software languages (e.g. Java, JavaScript, Scala, Python, Ruby on Rails); experience with software development in a team environment; experience with one or more software version control systems (e.g. Git, Subversion); experience with software development on a team using Agile methodology; familiarity with DevOps tools and techniques (e.g. continuous integration, Jenkins, Puppet, etc.); experience with cloud infrastructure (e.g. Amazon Web Services); experience with big data processing frameworks (e.g. Spark); preferred: experience with big data tools (e.g. Databricks). Education: advanced degree (MS, PhD) in a quantitative or policy-related field preferred. Clearance: must be able to obtain a Public Trust. Location: Rockville, MD. Here at TISTA Science and Technology we value our veterans and encourage all to apply! TISTA is an equal opportunity/affirmative action employer; all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability, or protected veteran status.

Big Data Architect. CGI: Experience the commitment. Category: Analytics and Emerging Digital Technologies. City: New York, New York, United States. Position ID: J0318-0429. Employment type: full time.

Position description: CGI is seeking a Big Data Architect to help a strategic financial services customer build a next-gen surveillance model for risk mitigation. In this role you will apply broad knowledge of current machine learning and statistical modeling methods, and an understanding of how such methods are applied to compliance, specifically in the financial services industry. You will work in a complex, multi-functional, cross-platform domain and will apply expertise in data science. This is an exciting opportunity to work on a high-profile project in a high-impact part of a global bank.

Your future duties and responsibilities:
- Manipulating large datasets, with demonstrated experience leading such work to successful conclusions
- Applying an understanding of trader and employee compliance in the financial services industry
- Strong programming skills (such as Hadoop MapReduce or other big data frameworks)
- Statistical modeling (with tools like Jupyter, Python, or R)
- Relevant experience working in the financial services industry
- Proficiency in statistical analysis, quantitative analytics, forecasting, and predictive analytics
- Creating automated anomaly detection systems and constantly tracking their performance
- Identifying what data is available and relevant, including internal and external data sources, and leveraging new data collection processes such as smart meters
- Effectively communicating with and positively influencing diverse stakeholders and team members
- Good oral and written communication skills
- Structured in approach and well-organized
- Working with IT teams to support data collection, integration, and retention requirements based on input collected from the business

Required qualifications to be successful in this role:
- 3-5 years of relevant quantitative and qualitative research and analytics experience
- 3+ years of experience building machine learning models in R, Python, or SAS, using techniques such as random forest, ANN, SVM, and logistic regression
- Hands-on expertise with SQL databases such as Oracle is required
- Knowledge of implementing streaming models (dynamic, incremental models) such as streaming k-means and streaming random forest is desired (a minimal sketch follows at the end of this posting)
- Ability to take ownership, work under pressure, and meet deadlines on time
- Team player willing to work in an international environment and be part of a global team
- Works effectively in a dynamic environment with changing priorities
- Demonstrated experience working in a complex, global environment with team members located in multiple physical locations and across time zones
- Exceptional analytical skills
- Demonstrated ability to work in an Agile environment

Education: bachelor's degree in mathematics, statistics, computer science, operations research, physics, or another quantitative discipline like financial engineering.

What you can expect from us: Build your career with us. It is an extraordinary time to be in business; as digital transformation continues to accelerate, CGI is at the center of this change, supporting our clients' digital journeys and offering our professionals exciting career opportunities. At CGI, our success comes from the talent and commitment of our professionals. As one team, we share the challenges and rewards that come from growing our company, which reinforces our culture of ownership. All of our professionals benefit from the value we collectively create. Be part of building one of the largest independent technology and business services firms in the world. Learn more about CGI at www.cgi.com. No unsolicited agency referrals, please. CGI is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to their race, ethnicity, ancestry, color, sex, religion, creed, age, national origin, citizenship status, disability, medical condition, military and veteran status, marital status, sexual orientation or perceived sexual orientation, gender, gender identity and gender expression, familial status, political affiliation, genetic information, or any other legally protected status or characteristic. CGI provides reasonable accommodations to qualified individuals with disabilities. If you need an accommodation to apply for a job in the U.S., please email the CGI U.S. Employment Compliance mailbox at us_employment_compliance@cgi.com; you will need to reference the requisition number of the position in which you are interested. Your message will be routed to the appropriate recruiter, who will assist you. Please note this email address is only to be used by individuals who need an accommodation to apply for a job; emails for any other reason, or those that do not include a requisition number, will not be returned.
We make it easy to translate military experience and skills! Visit https://cgi-veterans.jobs, our site dedicated to veterans and transitioning service members. All CGI offers of employment in the U.S. are contingent upon the ability to successfully complete a background investigation; background investigation components can vary depending upon the specific assignment and/or the level of US government security clearance held. CGI will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay or the pay of another employee or applicant. However, employees who have access to the compensation information of other employees or applicants as a part of their essential job functions cannot disclose the pay of other employees or applicants to individuals who do not otherwise have access to compensation information, unless the disclosure is (a) in response to a formal complaint or charge, (b) in furtherance of an investigation, proceeding, hearing, or action, including an investigation conducted by the employer, or (c) consistent with CGI's legal duty to furnish information.

Skills: analytical thinking; analytics & emerging digital tech; communication (oral, written); Hadoop, Hive; investment banks & securities dealing; Python; quantitative user research; R programming language; SAS; software architecture; solution analysis; SQL.

CGI is committed to the principles of equal employment opportunity and to compliance with US laws and regulations; our US EEO/Affirmative Action policy is available at documents.njoyn.com. Applicants have rights under federal employment laws: EPPA (https://www.dol.gov/whd/regs/compliance/posters/eppac.pdf), FMLA (https://www.dol.gov/whd/regs/compliance/posters/fmlaen.pdf), and EEO is the Law.
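Since the desired qualifications above name streaming k-means, here is a minimal, hedged sketch using Spark MLlib's StreamingKMeans (the legacy DStream-based API). The input path, feature layout, and parameter values are illustrative only; a real surveillance system would feed actual trade features and act on cluster distances rather than print assignments.

import org.apache.spark.SparkConf
import org.apache.spark.mllib.clustering.StreamingKMeans
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.streaming.{Seconds, StreamingContext}

object StreamingAnomaly {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("streaming-kmeans")
    val ssc = new StreamingContext(conf, Seconds(10))

    // Hypothetical feed: one comma-separated numeric feature vector per
    // line, e.g. order size, price deviation, venue id (3 features).
    val training = ssc.textFileStream("/data/stream/train")
      .map(line => Vectors.dense(line.split(',').map(_.toDouble)))

    // The model updates itself on every batch; a decay factor below 1
    // forgets old data, which is what makes it dynamic and incremental.
    val model = new StreamingKMeans()
      .setK(5)
      .setDecayFactor(0.7)
      .setRandomCenters(dim = 3, weight = 0.0)

    model.trainOn(training)

    // Assign each new point to its nearest cluster as it arrives.
    model.predictOn(training).print()

    ssc.start()
    ssc.awaitTermination()
  }
}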
Ab Initio POS Data Analyst: One of our clients, a global market research company, is looking for a talented Ab Initio POS data analyst. This is a permanent position with an excellent compensation package and benefits. Location: metro NY area. Candidates must be authorized to work for any employer in the US; no H1 candidates can be considered for this role. Please read the description below and, to be considered immediately, email your resume to barryr@brainsworkgroup.com with your rate/salary history and requirements.

Responsibilities:
- Ensure timely deadlines are met for onboarding new retailers
- Provide concise and detailed analysis of POS data
- Read and write scripts in Unix/Linux
- Develop retailer feeds within the Ab Initio ACE system; must have ACE experience
- Pre-validate the data by running analyses through tools
- Ensure proper communication and hand-off to the operations/production team
- Work on special projects, such as data mining for escalated production issues, and document related changes to the onboarding process
- Analyze and develop various retailers' raw data
- Coordinate with the global retail engagement team and other internal departments, including product, global operations teams, and IT

Qualifications:
- 5+ years of experience developing and architecting solutions in the retail sector, with knowledge of ETL and data warehousing technologies including Ab Initio
- Proven development experience in Ab Initio GDE, with exposure to ACE and Control Center
- Additional ETL experience similar to Informatica or PL/SQL
- Experience with databases, Oracle (preferred), and Unix
- Exposure to multiple diverse technical configurations, technologies, and processing environments
- Excellent analytical and technical skills
- Other desired skills: data collection, configuration, and analysis methodologies; tools such as Ab Initio, SAS, SQL, APIs
- Ability to work on multiple priorities and/or projects simultaneously in a highly collaborative environment with minimal supervision
- Bachelor's degree in computer science, engineering, or a related field

Use this link to apply directly: https://brainsworkgroup.catsone.com/careers/index.php?m=portal&a=details&joborderid=10250383, or email barryr@brainsworkgroup.com. Check all our jobs: http://brainsworkgroup.catsone.com/careers

Roles and responsibilities: Standards IT is seeking an analytics administrator in the Oklahoma City area. The successful candidate will design, develop, and implement systems to ensure customer requirements are met in an agile and cost-effective manner. The candidate will have both strong technical and personal skills, with the ability to work with multiple clients on a weekly basis.

Essential functions: to succeed in this role as an analytics administrator, you will need the following:
- Extensive experience with relational database management systems and data warehouses
- Extensive experience building ETL processes to convert operational transactional data into dimensions and fact tables (see the sketch at the end of this posting)
- Experience in an agile project management environment
- Experience with the Microsoft business intelligence stack: SQL (SSMS, SSRS, SSIS, SSAS, PowerBI, extensive T-SQL); SharePoint (public and team sites, workflow automation, integration, PowerBI); the Office suite (Excel, PowerBI, ETL, data modeling, and visualization); Visual Studio (Team Foundation Services, .NET development)
- Experience developing business intelligence solutions
- Experience with data science solutions
- Experience managing and monitoring system state metrics
- Experience developing process performance metrics
- Experience working in a medical environment is preferred
- Experience with VB.NET/C# development is preferred
- Experience building ETL processes to convert operational transactional data into dimensions and fact tables is preferred

Additional responsibilities:
- Design and develop business intelligence solutions using multidimensional cubes, including MOLAP and ROLAP
- Develop business intelligence solutions including reports, dashboards, KPIs, and scorecards
- Analyze user applications and develop specifications that can be presented to non-technical users
- Organize and prepare program and system documentation according to established procedures to facilitate ongoing support and maintenance
- Coordinate system implementation by planning and monitoring system testing and conversion, to ensure a smooth conversion to new programs

Qualifications: 5 years of data analysis and business intelligence experience required; 3 years of experience with ETL technologies preferred; 1-2 years of healthcare data background preferred.
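Here is a hedged sketch of the dimension half of the "transactional data into dimensions and fact tables" requirement: deriving a deduplicated customer dimension with surrogate keys from an operational extract. All names are invented; on this posting's Microsoft stack the same shape would more likely be an SSIS package or a T-SQL MERGE rather than Spark.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions._

object BuildCustomerDim {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("build-customer-dim").getOrCreate()

    // Operational extract: one row per order, customer attributes repeated.
    val orders = spark.read.parquet("/staging/orders")

    // Keep only the latest attribute values per natural key...
    val latest = Window.partitionBy(col("customer_id")).orderBy(col("updated_at").desc)
    val current = orders
      .select("customer_id", "customer_name", "region", "updated_at")
      .withColumn("rn", row_number().over(latest))
      .where(col("rn") === 1)
      .drop("rn", "updated_at")

    // ...and assign a surrogate key, as a warehouse dimension expects.
    // Note: these ids are not stable across runs; a production job would
    // look up and preserve existing keys instead.
    val dim = current.withColumn("customer_key", monotonically_increasing_id())

    dim.write.mode("overwrite").parquet("/warehouse/dim_customer")
    spark.stop()
  }
}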
Spotify is looking for an exceptional individual with a strong data science and strategy consulting background to work closely with our finance and legal teams in New York City to drive cross-functional data initiatives. Your work will inform the legal team's strategy across cutting-edge digital legal issues, including copyright and privacy, and you will also be deeply involved in designing and deploying user lifetime-value related experiments to improve campaign effectiveness and user experience. You will have the rare opportunity to dive deeply into data from all corners of the company in this role; above all, you will be at the nexus of data science and cutting-edge data issues at one of the most innovative companies in the world.

What you'll do:
- Understand the strategy, priorities, and pain points of our finance and legal teams, and bring that deep understanding to the data science and data engineering community at Spotify
- Maximize the cross-functional impact of our research and data platforms, and partner with functional leads to identify and evaluate technology investment opportunities that increase the scale and efficiency of user communication and marketing campaigns
- Conceive of new features or components of existing data projects or platforms that would advance the work of our finance and legal teams
- Design new solutions (spanning research, platforms, and culture) to advance the work of finance and legal, and consult with other solutions leads to explore the potential for common solutions
- Be a thought leader within the legal team as it implements policies and processes governing the company's compliance with data privacy regulations, drawing on best practices learned from interaction with other companies, scholarship, conferences, etc.
- Communicate what we solve and build in data to the function, ensuring that outputs land successfully and achieve their intended impact; follow that impact and share findings with data squads to inform our 'tweak it' phases

Who you are:
- Minimum 5 years of experience in experiment design, legal analytics, financial modeling, or strategy consulting roles at Fortune 500 companies or top consulting firms
- Willingness to learn relevant legal concepts as underpinnings of analysis is a must
- Extensive experience manipulating and analyzing complex data with SQL and other tools such as Python; familiarity with Google BigQuery is a plus
- Comfort operating independently in a fast-paced work environment
- Experience interfacing with senior business leaders across multiple time zones
- A minimum of 3 years of experience working cross-functionally with engineers, researchers, data scientists, and business stakeholders

We are proud to foster a workplace free from discrimination. We strongly believe that diversity of experience, perspectives, and background will lead to a better environment for our employees and a better product for our users and our creators. This is something we value deeply, and we encourage everyone to come be a part of changing the way the world listens to music.

First San Francisco Partners is a business advisory and enterprise information management (EIM) consultancy dedicated to helping companies leverage their data to improve strategic decision making, reduce risk, create operational efficiencies, and fuel unprecedented business success. Our services span data governance, data quality strategies, data management architecture, master data management strategy and implementation, analytics, and big data.

Job responsibilities and duties: We have an immediate opening for a reference data consultant. The consultant is expected to establish enterprise-level processes for managing reference data, otherwise known as code tables, domains, or look-ups. These duties will be carried out jointly with a technical team that is building the infrastructure for a golden copy of reference data. Duties will include developing, gaining approval for, and implementing processes to:
- Discover and select external standards for reference data
- Profile and acquire external standards
- Manage subscriptions, both paid and unpaid, to external reference data
- Onboard and ingest external reference data
- Customize externally acquired reference data and ensure it is kept up to date
- Identify internal reference data sets and their stakeholders
- Govern internal reference data sets and manage changes to them
- Distribute reference data
- Ensure the quality of reference data is high in the systems that use it (e.g. periodic reconciliation to the golden copy)
- Manage the semantics of reference data (particularly definitions and instructions for use)
- Manage errors and issues with reference data
- Monitor use of reference data to provide statistics and metrics to senior management
- Engage and work with both business and IT stakeholders

Skills and qualifications: 5-7 years of experience working with reference data; excellent communication, presentation, and interpersonal skills are required; a demonstrated track record of making a difference and adding value; strong organizational skills with the ability to multi-task; ability to think creatively; highly driven and self-motivated; ability to work with and adjust to changing deadlines; creative problem-solving skills; must be able to develop relationships across the organization, working cross-functionally to get results; ability to present complex information in a simplified fashion to facilitate understanding; bachelor's degree in business administration, computer science, CIS, or a related field.

Shiftgig, one of Chicago's hottest and fastest growing technology companies, is actively seeking a data architect for our Chicago office. We are a tech company in a non-tech space, and as such we have a lot of interesting problems to solve. We solve many of these using the large and diverse data set we have, so you will be a key member of our team. Currently we have a team of three data scientists who need technology support and direction in pipelining data from both the Shiftgig product and supporting back-office platforms from sales, accounting, and recruiting.
Today, this platform is a collection of SaaS-based tools performing various integrations, ETL processes, and data science algorithm hosting. We currently lack a dedicated data engineering team member, and we are looking for a hands-on software engineer who is ready to dig in, evaluate the current state, architect and deliver sustainable solutions to our data science team, and work with our product engineering team to create a cohesive real-time data analytics and insights platform. Key upcoming projects include enabling our marketplace to have a data-science-driven algorithmic matching process between our workers and our customers that maximizes successful work outcomes in our marketplace. The direction of our matching and ratings will eventually drive experiments around dynamic pricing, upskilling opportunities, and recruiting efficiency.

Qualifications:
- 3-5 years developing high-quality production software
- Past experience building high-quality software, with an understanding of coding patterns, practices, and testing
- Demonstrated experience in modeling and architecting scalable data systems
- Experience working with various databases, such as PostgreSQL, SQL Server, MongoDB, Couchbase, Cassandra, and Redis
- Understanding of big data platforms, warehouses, and relational, NoSQL, and non-relational stores, and of when best to use each of them
- Ability to organize macro-level processes and systems
- Experience building non-trivial ETL solutions that scale, extracting data from operational product and business systems (a small sketch follows this posting)
- Delivered business intelligence capabilities via tools like Domo and Tableau
- Proven ability to take business needs and deliver data-informed solutions
- Experience with Agile development processes
- Excellent written and verbal communication skills

Desired qualifications:
- 3+ years building aspects of a data pipeline and warehouse
- Working with support teams to enhance testing practices for the future
- Experience using one or more core AWS data technologies, such as Redshift, Hadoop, Kafka, Glue, RDS
- Experience programming in one or more of our current core languages: Python, .NET, JavaScript, Java, Objective-C, Kotlin, and Swift
- Expertise in Agile software development process
- Familiarity with common security concerns in web and mobile applications
- Performance and load testing experience
- Automated security tools knowledge
- Working in a microservice-oriented architecture

Shiftgig was founded on the simple premise that many people want flexible work opportunities that fit into the rest of their lives, so we build technology focused on one thing: connecting people who want temporary work right now with businesses who need them. We're fulfilling our mission of connecting millions of people with millions of shifts via our mobile apps and platform. Our apps make it easy for businesses to post gigs and for qualified and skilled workers to claim them; our platform handles shift fulfillment and all the messy bits associated with labor management. We are a tech company in a non-tech space with many interesting problems to solve. Our architecture is centered around internal and external APIs, where the consuming clients are our single-page applications and native mobile apps.
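A hedged sketch of the "non-trivial ETL solutions that scale, extracting data from operational product and business systems" line above: an incremental Spark extract from an operational PostgreSQL database into warehouse staging. The connection details, table, and watermark column are invented for illustration, and the PostgreSQL JDBC driver is assumed to be on the classpath.

import org.apache.spark.sql.SparkSession

object ExtractShifts {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("extract-shifts").getOrCreate()

    // Pull only rows changed since the last run (watermark passed as an arg).
    val since = args.headOption.getOrElse("1970-01-01")
    val query = s"(SELECT * FROM shifts WHERE updated_at > '$since') AS incr"

    val shifts = spark.read
      .format("jdbc")
      .option("url", "jdbc:postgresql://ops-db:5432/product")
      .option("dbtable", query)
      .option("user", "etl")
      .option("password", sys.env.getOrElse("ETL_DB_PASSWORD", ""))
      .load()

    // Land the increment; downstream jobs merge it into warehouse tables.
    shifts.write.mode("append").parquet("s3://warehouse/staging/shifts/")
    spark.stop()
  }
}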
Key accountabilities and core job responsibilities:
- Lead and participate in the IT development of business intelligence solutions to meet the needs of the commercial organization (sales, sales operations, market access, marketing)
- Carry out data modeling, including evolution from conceptual to physical models, using a tool like Erwin
- Define data profiling and analysis criteria and develop data quality metrics
- Define and implement error and exception handling strategies
- Establish criteria for integration and user testing
- Handle change management, release management, and source code control practices
- Lead multi-component, cross-functional programs of high complexity
- Establish, enhance, and communicate technical guidelines and best practices for the application and integration development teams
- A self-starter, 'can-do' attitude is a must in a fast-moving business environment, along with a willingness to make a difference!

Desired skills:
- 10+ years of experience and proven capabilities in the application architecture and development environment as an EDW architect/lead
- Mandatory: at least 2-4 years of hands-on functional and technical experience as an architect building out a commercial EDW platform
- Knowledge of data warehousing principles, master data management, and data governance
- Strong knowledge of ETL, SQL & PL/SQL, and experience working with relational databases (Oracle, SQL Server)
- Strong knowledge of processes and data related to sales, sales operations, marketing, market access, etc. (e.g. direct sales, self-dispensing pharmacy, competitor sales, prescriptions, shipments, distribution channels, CRM, incentive programs, physician affiliations, pricing decisions, payer contracts, speaker programs, etc.)
- Strong knowledge of applications related to the commercial organization (e.g. IMS, SPP, SD, direct sales, self-dispensing pharmacy, shipment, Veeva, speaker program, SAS, etc.)
- Bachelor's degree in computer science or equivalent technical experience
- Extensive business and/or technical background in the areas of data modeling, EDW, data marts, master data management, metadata management, data quality, data governance, data integration (ETL), and data security
- Experience with data implementation methodologies and best practices (e.g. waterfall, agile, other)

Big Data Engineer - Hadoop. Job details: Location: Matthews, NC. Salary: $52-$60 per hour. Date posted: Wednesday, March 14, 2018. Job type: temp to perm. Industry: information technology. Reference: 608098.

Job description: This Senior Data Engineer position is responsible for designing and maintaining highly scalable data management systems and building high-performance algorithms, prototypes, and predictive models, as well as creating custom software components by employing a variety of programming languages and off-the-shelf tools to marry systems together. In this role, the data engineer will work closely with data architects to determine which data management systems are appropriate, and with data scientists to determine which data is needed for analysis. This position will work with stakeholders to understand their information needs and translate these into technical solutions using such tools as Spark, HBase, Hive, NiFi, Kafka, Sqoop, Python, R, etc. This position will work intimately with the existing data science staff to take existing or new prototype models and convert them into scalable analytical solutions.

More specifically, responsibilities include:
- Translating complex functional and technical requirements into detailed architecture, design, and high-performing software
- Leading big data analytical solutions leveraging transformational technologies
- Driving user ideation, analysis and elaboration, design and development of software applications, testing, and building automation tools
- Selecting data solution software and defining hardware requirements
- Translating business requirements into system requirements
- Building a next-generation big data analytics framework developed on a group of core technologies
Big Data Engineer - Hadoop

Job details:
+ Location: Matthews, NC
+ Salary: $52 - $60 per hour
+ Date posted: Wednesday, March 14, 2018
+ Job type: temp to perm
+ Industry: information technology
+ Reference: 608098

Job description: This position of senior data engineer is responsible for designing and maintaining highly scalable data management systems, building high-performance algorithms, prototypes, and predictive models, and creating custom software components, employing a variety of programming languages and off-the-shelf tools to marry systems together. In this role the data engineer will work closely with data architects to determine which data management systems are appropriate, and with data scientists to determine which data is needed for analysis. This position will work with stakeholders to understand their information needs and translate these into technical solutions, using such tools as Spark, HBase, Hive, NiFi, Kafka, Sqoop, Python, R, etc. This position will work intimately with the existing data science staff to take existing or new prototype models and convert them into scalable analytical solutions.

More specifically, responsibilities include:
- Translating complex functional and technical requirements into detailed architecture, design, and high-performing software
- Leading big data analytical solutions leveraging transformational technologies
- Driving user ideation, analysis and elaboration, design and development of software applications, testing, and building automation tools
- Selecting data solution software and defining hardware requirements
- Translating business requirements into system requirements
- Building a next-generation big data analytics framework developed on a group of core technologies
- Coding, testing, and documenting new or modified data systems to create robust and scalable applications for data analytics
- Ensuring all automated processes preserve data by managing the alignment of data availability and integration processes
- Collaborating with data architects, modelers, and IT team members on project goals
- Assisting with extracting standard sets of features from important datasets that will be leveraged by the data science team
- Converting code from applications in R, Python, or other languages into Spark applications (a hedged sketch follows below)

Required qualifications:
- Master's degree in computer science or engineering, with 10+ years of total relevant experience
- 5+ years of experience in data management and/or data engineering
- 4+ years of development experience in big data environments (Hadoop, Hive, Spark)
- Hands-on coding skills in Spark (1.6 and 2.0), R, Python, Java, and/or C#
- Fundamental understanding of RDBMS and extensive experience using ETL tools
- Collaboration skills, working with data architects, modelers, and other IT team members

Desired qualifications:
- Ph.D. in computer science, engineering, or management information systems
- Extensive experience in large-scale system implementation and project management
- Mastery of machine learning algorithms and advanced mathematics
- Excellent verbal and written communication skills, and the ability to bridge gaps between data science and business management
- Experience with SAP HANA definitely a plus
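Converting a prototype into a Spark application, as the responsibilities above mention, often starts with translating a pandas- or R-style aggregation into the DataFrame API. A minimal sketch, assuming PySpark is available and using hypothetical column names (user_id, event_ts, amount) and placeholder paths:

```python
from pyspark.sql import SparkSession
import pyspark.sql.functions as F

# pandas prototype a data scientist might hand over:
#   daily = df.groupby(["user_id", "day"])["amount"].sum().reset_index()

spark = SparkSession.builder.appName("prototype-to-spark").getOrCreate()

events = spark.read.parquet("s3://example-bucket/events/")   # path is a placeholder

daily = (events
         .withColumn("day", F.to_date("event_ts"))
         .groupBy("user_id", "day")
         .agg(F.sum("amount").alias("amount")))

daily.write.mode("overwrite").partitionBy("day").parquet("s3://example-bucket/daily/")
```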
Principal Big Data Engineer - NYC - $200K + 30% bonus. My client is looking for a senior big data engineer ASAP!

Role and responsibilities:
- Provide highly technical and strong leadership to a team
- Use understanding of big data concepts to fully design and communicate project goals
- Be able to communicate technically and non-technically to higher management
- Participate in the full software development life cycle (SDLC) of the data & analytics solution
- Participate in and lead client engagements
- Assess usage and utilization trends to make appropriate recommendations toward scaling our cloud environments
- Manage infrastructure costs in cloud and hybrid environments
- Perform scripting and create future-proof cloud designs and implementations

Required experience and qualifications:
- Experience working with big data Hadoop technologies (Spark, Pig, Hive, etc.)
- Proven experience working on the Azure and/or AWS cloud platform in production
- Experience with Azure Stream Analytics, Data Lake, Data Factory, SQL Data Warehouse, etc.
- Experience with AWS Redshift, EMR, EC2, and S3 is a plus
- Experience with Java, Scala, and Python

Benefits and bonus: competitive base and bonus, full medical/health insurance, the opportunity to work with the newest technology in the industry, remote capabilities, free lunches every day, metro pass, and zero travel requirements.

If you or someone you know is interested in this position, please send your resume directly to e.styles@frgconsulting.com or call 212-731-8282. My client is looking to start the interview process as soon as possible. FRG Consulting is a leader in niche IT recruitment with a focus on cloud technologies, including Azure and AWS, as well as big data, data science, and BI. We deal with both Microsoft partners, AWS partners, and end users throughout North America. We have open positions and relationships with some of the top partners and end users throughout the US, and offer some excellent opportunities in the BI / big data space. I understand the need for discretion and would welcome the opportunity to speak to any big data and cloud analytics candidates who are considering a new career or job, either now or in the future. Confidentiality is of the utmost importance. For more information on available BI jobs, as well as the business intelligence, big data, and cloud market, I can be contacted at 212-731-8282. Please see www.frgconsulting.com for more information. - provided by Dice
Cures start here. At Fred Hutchinson Cancer Research Center, home to three Nobel laureates, interdisciplinary teams of world-renowned scientists seek new and innovative ways to prevent, diagnose, and treat cancer, HIV/AIDS, and other life-threatening diseases. Fred Hutch's pioneering work in bone marrow transplantation led to the development of immunotherapy, which harnesses the power of the immune system to treat cancer. An independent, nonprofit research institute based in Seattle, Fred Hutch houses the nation's first cancer prevention research program, as well as the clinical coordinating center of the Women's Health Initiative and the international headquarters of the HIV Vaccine Trials Network. Careers start here.

A highly motivated bioinformatician is sought to support research in the lab of Dr. Finak at Fred Hutchinson Cancer Research Center in Seattle, WA.

About the lab: We are part of the RGLab, the Biostatistics, Bioinformatics and Epidemiology program, and the Vaccine and Infectious Disease Division of Fred Hutch in Seattle. We develop statistical methods and software tools for the analysis of high-throughput biological data, with an emphasis on immunology and vaccine research. We work with bench scientists and clinicians to understand and ultimately help develop vaccines for, and/or cure, severe diseases such as HIV, malaria, and cancer. We are a diverse group with training in statistics, computer science, web development, bioengineering, bioinformatics, and computational biology. Learn more about us at rglab.org and see some of our work on GitHub (github.com/rglab).

Dr. Finak is seeking a bioinformatician / software engineer to help build the next generation of computational flow cytometry tools under the Flow Cytometry Data Standards, Integration and Analysis R01 award. The successful candidate will help develop, implement, and optimize computational flow cytometry software tools and analytic pipelines for large clinical and non-clinical flow and mass cytometry data sets, leveraging R and C++. We are building open-source tools to enable analysis, integration, and sharing of large flow cytometry data sets, with a focus on facilitating reproducible research. The candidate will work as part of a team to support data analysis, software development, and international collaborations, and is expected to adhere to good software development practices (e.g., design, unit tests, documentation, code review) and participate in regular team meetings.

To be considered, candidates must have:
- MSc in bioinformatics, data science, computer science, software engineering, or an equivalent field, with 1+ years of related experience; a BSc with 4+ years' experience also qualifies
- R (expert) and C++ (proficient)
- Web stack and AWS cloud computing (working knowledge)
- Unix-like operating system environment (proficient)
- R tidyverse tools (proficient)
- R package development (familiar)
- Software version control, e.g., Git (working knowledge or greater)

The candidate should possess a strong work ethic and analytical skills, and the ability to work in a team, meet deadlines, and deliver a robust, well-tested, and well-documented quality codebase.

Preferred qualifications: a track record of contributions to open-source software projects; excellent written and oral communication skills.

To apply: please apply with your CV and a letter summarizing previous work experience.

We are committed to cultivating a workplace in which diverse perspectives and experiences are welcomed and respected. We are proud to be an equal opportunity and VEVRAA employer. We do not discriminate on the basis of race, color, religion, creed, ancestry, national origin, sex, age, disability, marital or veteran status, sexual orientation, gender identity, political ideology, or membership in any other legally protected class. We are an affirmative action employer; we encourage individuals with diverse backgrounds to apply and desire priority referrals of protected veterans. If, due to a disability, you need assistance and/or a reasonable accommodation during the application or recruiting process, please send a request to our Employee Services Center at escmail@fredhutch.org or call 206-667-4700.
Title: Big Data Developer
Location: Philadelphia, PA
Type: Contract

Partner's Consulting recruits for positions that allow our consultants to make a positive impact, with opportunity for growth, at exciting, high-level organizations. Our consultants enjoy flexibility, competitive pay, and the ability to be part of an awesome, award-winning company with a progressive work culture.

Description: Our client is seeking a big data developer to work on their product analytics and behavior science team, which is responsible for nearly all of the product, web, app, and STB datasets used at the company for analyzing the product experience. The team builds datasets for the purpose of analysis and collaborates with modeling experts to build production data pipelines. As a member of this team, you would be at the center of product innovation, as nearly every team member depends on this data.

Key accountabilities / sample projects:
- Builds datasets in partnership with data scientists
- Creates source datasets for A/B testing
- Creates data streaming applications for machine learning applications (a hedged streaming sketch follows this posting)
- Maintains prediction APIs
- Establishes new patterns for efficient processing of 100s-of-TB datasets
- Defines metrics for tracking how customers are interacting with products and services
- Partners with UI engineering teams to enable new customer experiences
- Partners with customer experience teams to create an improved experience and be the voice of the customer
- Leads data-driven product development

Required skills:
- Background in a quantitative, technical field
- Extensive experience working with large data sets, and some experience in data-driven decision making
- At least 5 years of experience in a data-driven environment, designing and building distributed data processing systems
- Related industry experience (cable, telecommunications, etc.)
- Hands-on experience developing big data pipelines end to end
- Proficiency with programming in Python, Java, and/or Scala; you strive to write beautiful code and are comfortable working in a variety of tech stacks
- Self-motivated, ambitious, and quick to take action, while also open to new ideas
- Stays current with technology and continually strives to be better at your craft
- Must have AWS experience developing data streaming pipelines
- Understanding of Spark (EMR, Databricks, and/or Apache) on EC2, S3 as object store, Kafka, Kinesis, Presto, Athena (Druid)
- Experience as a programmer in Python, Java, and/or Scala
- Good communication skills

Partner's Consulting is a reputable, nationally certified women-owned business enterprise: an information technology consulting and recruiting firm based in the Philadelphia area. We have built personal relationships with our clients over the years and work directly with hiring managers on all our open positions. Our core values are focused on the highest level of professional dedication to our clients' requirements, coupled with our desire to partner for joint success. If you have any questions or concerns about the posting above, please contact us at recruiter@partners-consulting.com. 41618-1 - provided by Dice
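"Data streaming applications for machine learning" in the list above typically means landing feature-ready records from a stream. A minimal PySpark Structured Streaming sketch, assuming the spark-sql-kafka connector is on the classpath, a broker at localhost:9092, and a hypothetical clicks topic whose schema is invented for illustration:

```python
from pyspark.sql import SparkSession
import pyspark.sql.functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("clicks-stream").getOrCreate()

schema = StructType([StructField("user_id", StringType()),
                     StructField("page", StringType()),
                     StructField("dwell_seconds", DoubleType())])

raw = (spark.readStream.format("kafka")
       .option("kafka.bootstrap.servers", "localhost:9092")
       .option("subscribe", "clicks")            # topic name is a placeholder
       .load())

# Kafka delivers bytes; parse the JSON payload into typed columns.
clicks = (raw.select(F.from_json(F.col("value").cast("string"), schema).alias("c"))
             .select("c.*"))

# Write micro-batches to a feature area the ML side can read.
query = (clicks.writeStream.format("parquet")
         .option("path", "/tmp/features/clicks")
         .option("checkpointLocation", "/tmp/chk/clicks")
         .start())
query.awaitTermination()
```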
Have you ever been frustrated by the home-buying process and, as you're searching for your next home, condo, or apartment, thought that technology could improve the experience? If you have used the Edina Realty, TheMLSonline.com, RE/MAX Results, or HomeSpotter app, you are familiar with our work. HomeSpotter is a funded startup with significant continued growth and recognized success, is a thought leader in mobile-first real estate search and collaboration, and is growing our team through 2018 and beyond.

Data engineering: The data engineering team at HomeSpotter is responsible for ensuring the correctness and availability of the company's data, and for the creation and maintenance of systems that read, write, and manipulate that data. As part of the data engineering team, you'll leverage a mix of cutting-edge and legacy tools to solve some of the most fascinating and impactful problems in real estate advertising technology. This position is a core engineering role helping us build out and scale up our ETL pipeline and data warehouse. Data engineers at HomeSpotter spend quite a bit of time creating and shipping code, administering databases, and collaborating with other teams to help us improve our engineering practices.

You:
- Have 5+ years of large-scale DBA experience with MySQL, including distributed system design, performance tuning, debugging, gathering metrics, security, and disaster planning/recovery
- Are fluent in at least one programming language, preferably Python, Java, or PHP
- Can read, write, and refactor highly performant database queries and code
- Have excellent communication skills
- Have strong architecture and system design skills
- Have a pragmatic mindset and are passionate about iterative experimentation
- Are a persistent and tenacious problem solver
- Will participate in the pager duty rotation

Additional awesomeness:
- Experience with real estate technology, e.g., RETS feeds
- Spark, Storm, Hadoop, Kafka, etc.
- Go, Clojure, Perl, R
- Large datasets
- A/B testing
- Data science: statistical modeling, machine learning for classification and regression

More about HomeSpotter: HomeSpotter makes buying and selling homes delightful. Founded in 2009, HomeSpotter is an award-winning, established, funded startup that has experienced substantial growth. HomeSpotter is located at the Minneapolis Grain Exchange in downtown Minneapolis and has a growing base of clients across North America, servicing over 250,000 agents. Finding talented engineers like you is an ongoing endeavor throughout 2018 and beyond, so if you are not available now, let's have a conversation anyway!

Working only on W2; visa transfer with Idexcel (H1B):
- Extensive knowledge of DB, SQL, ETL, and Oracle; data migration experience
- Data warehousing expert: data transformation, data wrangling, data science
- Massive data migration expertise from SQL to PostgreSQL/Redshift (a hedged migration sketch follows below)
- Expertise in Informatica, Scala, Unix shell, and AWS automation
- Familiarity with machine learning and the R and Python languages

Desired skills: background in Verizon Partner Solution (VPS) applications.
Education/certifications: bachelor's degree.
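For the "massive data migration" bullet above, the usual pattern is reading from the source in chunks and bulk-inserting into the target. A minimal sketch using psycopg2 (shown Postgres-to-Postgres for brevity; a SQL Server source would swap in a driver such as pyodbc); the DSNs, table, and columns are placeholders, and a real migration would also handle schema mapping, retries, and checkpointing:

```python
import psycopg2
from psycopg2.extras import execute_values

SRC_DSN = "dbname=legacy user=etl"        # placeholder connection strings
DST_DSN = "dbname=warehouse user=etl"
BATCH = 10_000

with psycopg2.connect(SRC_DSN) as src, psycopg2.connect(DST_DSN) as dst:
    # Named cursor = server-side cursor, so the source streams in batches
    # instead of materializing the whole table in memory.
    with src.cursor(name="reader") as rd, dst.cursor() as wr:
        rd.itersize = BATCH
        rd.execute("SELECT id, name, updated_at FROM customers ORDER BY id")
        while True:
            rows = rd.fetchmany(BATCH)
            if not rows:
                break
            execute_values(
                wr,
                "INSERT INTO customers (id, name, updated_at) VALUES %s "
                "ON CONFLICT (id) DO NOTHING",   # idempotent re-runs
                rows)
            dst.commit()
```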
**Description:** Senior Data Engineer. ICF is comprised of experienced professionals with the skills required to unlock the power of governed data discovery. Our staff provides deep technical and business support for data acquisition, data analysis, data science, and interactive data visualizations.

Job description: This is a cross-functional position with significant responsibility and the opportunity to shape the future direction of our team. The selected candidate will join our growing team of analytics experts. This position will be part of a small analytical team working on quantitative software products and services, so experience working in a product development environment is desired. Successful candidates will be very analytical and detail-oriented, to ensure a stable, repeatable, and accurate production data stream. They will also enjoy a highly independent work environment where they can take a high level of ownership of their accomplishments.

What you'll be doing:
+ Create and maintain an optimal data pipeline architecture that includes data ingestion, processing, and storage, with a focus on consistency, reliability, and accuracy
+ Design and code workflows and optimize production data pipelines to ensure data accuracy
+ Perform hands-on data analysis to provide insights and uncover potential data issues
+ Map data feeds and combine them with third-party content, along with necessary data standardization
+ Develop and code operational processes to automatically report on data quality conditions and KPIs during processing
+ Be a part of fast-moving development teams using agile methodologies
+ Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
+ Work with data and analytics experts to strive for greater functionality in our data systems

**Qualifications**
+ 8 years of ETL and database solution development supporting data integration, data warehousing, data lake, and analytic pipelines
+ 2 years of experience working within a modern data architecture setup involving AWS or Azure and big data tools like Hadoop, Hive, Spark, Kafka, etc.
+ Strong analytical and reasoning skills that result in clear, robust, and accurate solutions; strong data debugging skills
+ 5 years of experience working within the full SDLC as part of a software product and operations team environment
+ Strong experience developing SQL scripts and data processing processes, and performance-tuning quantitative queries
+ Superior organizational and end-to-end delivery skills in an agile environment
+ Ability to manage multiple tasks and shifting priorities with proactive communication in mind
+ Ability to maintain strong working relationships with development and business teams based on a proactive communication style

Preferred skills/experience:
+ Experience with ETL tools like Talend, Pentaho, etc.
+ Experience with scripting: shell, Python, Java
+ Ability to work in a distributed team environment
+ Experience working on quantitative software products
+ An outstanding academic record with a focus on science, math, and engineering, or similar

Professional skills:
+ Ability to obtain a public trust clearance
+ Excellent listening, interpersonal, written, oral, and phone communication skills
+ Highly self-motivated and self-directed to solve complex technical problems
+ Ability to exercise independent judgment
+ Builds and maintains relationships with stakeholders to ensure buy-in and adoption of technology solutions
+ Self-motivated to continuously improve technical and professional skills
+ Ability to effectively prioritize and execute tasks

ICF is an equal opportunity employer that values diversity at all levels (EOE: minorities/females/protected veteran status/disability status/sexual orientation/gender identity). Working at ICF means applying a passion for meaningful work with intellectual rigor to help solve the leading issues of our day. Smart, compassionate, innovative, committed ICF employees tackle unprecedented challenges to benefit people, businesses, and governments around the globe. We believe in collaboration, mutual respect, open communication, and opportunity for growth. ICF: together for tomorrow.
**Location:** District of Columbia - Washington. **Requisition ID:** 1800001021

We are hiring a senior data engineer for a new purpose-driven, VC-backed data and AI startup founded by Adam Bly. We invite you to learn more about the vision and thesis for the company, and the problems we're setting out to solve, here.

You have: a proven record of personally taking large data projects from ideation to implementation; expertise working with high-volume heterogeneous data using distributed systems such as Hadoop, Bigtable, and Cassandra; expertise architecting, building, and operating large-scale batch and real-time data pipelines with data processing frameworks like Scalding, Scio, Storm, Spark, and Dataflow; strong knowledge of data modeling, data access, and data storage techniques; experience with agile development; and, preferably, work on graphs and open-source data-related projects.

You are: a systems thinker who is naturally predisposed to connecting dots; a scientifically minded individual who generates hypotheses from observations, conceives creative ways to test hypotheses, presents arguments supported by data, and changes your mind based on new data; mission-driven and passionate about the problems raised in the post shared above; and ready to build something big and ambitious from the ground up.

You will: architect and build key components of our first product, working with large amounts of heterogeneous data and tackling high-impact technical challenges; build products that help advance data science and machine learning; collaborate with experts in data science and ML and partners at top research labs; leverage best practices in continuous integration and delivery; and be part of the founding engineering team of the company, helping shape our engineering culture, values, and ways of working.
This is a very cool company: become a Spark analytics expert, work with the best and brightest, make a lot of money, get equity that is expected to be very valuable in a short amount of time, and work on some of the best federal analytics projects. Some details are listed below.

Our client is a pre-IPO product and services firm that specializes in data science and analytics for the federal government and commercial market. They utilize Apache Spark as the foundation of their products and solutions. They are seeking several big data solutions architects to join their growing team supporting the intelligence and defense communities. In this role you will lead architecture development for new applications, as well as plan and lead the migration of existing mission applications to Spark. This is not a sales role; it is a client-facing technical engineering role. In addition to hands-on technical implementation work, you will consult on the architecture and design of projects that are of strategic importance to intelligence and defense agencies. This role is not dependent on contract funding: our client has numerous engagements that utilize their products and services, and when they find the right people, they make firm offers without customer interviews being required.

There are a variety of possible work locations for solution architects. The goal is to have a solution architect support one or more customers close to their residence, to minimize commute times. There are worksite options in Maryland (the usual sites in the Ft. Meade, Columbia, Ellicott City, and Woodlawn areas), in Virginia (Fairfax, McLean, Springfield, Chantilly, Herndon, Reston), and in DC. On some projects a large portion of the work can be done remotely, with only occasional in-person meetings with the customer. If you live anywhere in the VA/DC/MD area and this is a good fit, we'd like to speak with you.

To be considered for this role you need the following:
- US citizenship and a current security clearance of TS/SCI or higher (FS poly options exist)
- Strong experience in the Hadoop and/or Spark ecosystem
- Hands-on coding experience with Scala or Python; hands-on Spark coding experience is a big plus, but not strictly required if your Hadoop and Scala/Python experience is strong

Good client-facing skills (consulting, verbal and written communication, etc.) are also important in this role. In terms of compensation, you can expect a strong salary, a significant bonus opportunity, a full benefits plan, and equity (which can lead to substantial additional compensation). 11153

Stanley Reid & Company is a technical and executive search agency that works with the most compelling firms in the US intelligence community and Department of Defense. If this opportunity appeals to you and you meet the requirements, please send us your resume. If this is not the right fit for you, please visit http://careers.stanleyreid.com to see a full list of our current opportunities in software engineering, data science, cyber security, and cloud infrastructure. Candidate referral program: we will pay a $5,000 referral fee if you introduce us to a friend or colleague whom we don't already know and whom we are able to place at any of our clients within 1 year of the referral (the referral fee is paid when the person reaches their 6-month employment anniversary). Top Secret / SCI.
**Business title:** Manager, Big Data Architect
**Requisition number:** 28764-6
**Function:** Business Support Services
**State:** CA
**City:** Irvine

**Description:** Known for being a great place to work and build a career, KPMG provides audit, tax, and advisory services for organizations in today's most important industries. Our growth is driven by delivering real results for our clients. It's also enabled by our culture, which encourages individual development, embraces an inclusive environment, rewards innovative excellence, and supports our communities. With qualities like those, it's no wonder we're consistently ranked among the best companies to work for by Fortune magazine, Consulting magazine, Working Mother magazine, Diversity Inc., and others. If you're as passionate about your future as we are, join our team.

KPMG is currently seeking a senior architect / big data software engineer to join our advanced analytics team.

Responsibilities:
+ Rapidly architect, design, prototype, and implement architectures to tackle the big data and data science needs of a variety of Fortune 1000 corporations and other major organizations
+ Work in cross-disciplinary teams with KPMG industry experts to understand client needs and ingest rich data sources such as social media, news, internal and external documents, emails, financial data, and operational data
+ Research, experiment with, and utilize leading big data methodologies such as Hadoop, Spark, Redshift, Netezza, SAP HANA, and Microsoft Azure
+ Architect, implement, and test data processing pipelines and data mining / data science algorithms in a variety of hosted settings, such as AWS, Azure, client technology stacks, and KPMG's own clusters
+ Translate advanced business analytics problems into technical approaches that yield actionable recommendations across multiple diverse domains; communicate results and educate others through the design and build of insightful visualizations, reports, and presentations
+ Develop skills in business requirement capture and translation, hypothesis-driven consulting, work stream and project management, and client relationship development

Qualifications:
+ Bachelor's degree from an accredited college/university in computer science, computer engineering, or a related field and a minimum of seven years of big data experience with multiple programming languages and technologies; or a master's degree and a minimum of five years of experience; or a PhD in computer science, computer engineering, or a related field with a minimum of three years of big data experience
+ Fluency in several programming languages such as Python, Scala, or Java, with the ability to pick up new languages and technologies quickly; understanding of cloud and distributed systems principles, including load balancing, networks, scaling, and in-memory vs. disk; and experience with large-scale big data methods such as MapReduce, Hadoop, Spark, Hive, Impala, or Storm
+ Ability to work efficiently in a Unix/Linux environment or .NET, with experience with source code management systems like Git and SVN
+ Ability to work with team members and clients to assess needs, provide assistance, and resolve problems, using excellent problem-solving skills, verbal and written communication, and the ability to explain technical concepts to business people
+ Ability to travel up to 80%
+ Applicants must be currently authorized to work in the United States without the need for visa sponsorship now or in the future

KPMG LLP (the U.S. member firm of KPMG International) offers a comprehensive compensation and benefits package. KPMG is an equal opportunity employer. All qualified applicants are considered for employment without regard to race, color, creed, religion, age, sex, gender, national origin, ancestry, citizenship status, marital status, sexual orientation, gender identity or expression, disability, physical or mental handicap unrelated to ability, pregnancy, veteran status, unfavorable discharge from military service, genetic information, personal appearance, family responsibility, matriculation, or political affiliation, or other legally protected status. KPMG maintains a drug-free workplace.
KPMG will consider for employment qualified applicants with criminal histories, in a manner consistent with the requirements of applicable local, state, or federal law (including San Francisco Ordinance Number 131192). No phone calls or agencies, please. **GL:** 4 **GF:** 15304

Req ID: 1017561BR
Company summary: The Walmart eCommerce team is rapidly innovating to evolve and define the future state of shopping. As the world's largest retailer, we are on a mission to help people save money and live better, with the help of some of the brightest minds in technology, merchandising, marketing, supply chain, talent, and more. We are reimagining the intersection of digital and physical shopping to help achieve that mission.

Job title: Principal Data Architect, Engineering
Position summary: Role: customer data quality and DR measurements specialist.

Job description and requirements:
* Build models and frameworks to assess quality and establish opportunities with customer data
* Mentor a team of engineers and scientists (not a management role)
* Expert in machine learning and big data
* Experience in the consumer data space and with data privacy

City: Sunnyvale. State: CA.

Position description:
+ Assists in the development of engineers and architects
+ Cultivates an environment where associates respect and adhere to company standards of integrity and ethics
+ Demonstrates and proves hardware or software technology concepts to other architects and management
+ Develops and implements product development strategies using agile development processes
+ Develops and implements strategies to attract and maintain a highly skilled and engaged workforce
+ Develops and leverages internal and external partnerships and networks to maximize the achievement of business goals
+ Guides business strategy efforts
+ Identifies and implements strategies for service-enabled technologies (for example, service-oriented architecture)
+ Improves the hardware or software technology environment
+ Influences the direction of engineering within the organization
+ Leads hardware or software technology projects
+ Leads hardware or software technology strategy development and implementation
+ Oversees the development of conceptual, logical, or physical hardware or software application designs
+ Provides overall direction for the technical architecture of hardware or software systems

Minimum qualifications:
* 7 years of hands-on experience in data science platforms or applications
* 15 or more years of experience in engineering
* Experience working with artificial intelligence and machine learning
* Master's or doctoral degree (preferred) in math, computer science, or related engineering disciplines
* Publications and/or patent authorship

Category: Software Development and Engineering. Division: Walmart Labs.
Division summary: @WalmartLabs is the technical powerhouse behind Walmart Global eCommerce. We employ big data at scale, from machine learning, data mining, and optimization algorithms to modeling and analyzing massive flows of data from online, social, mobile, and offline commerce. We don't just engineer cool websites, mobile apps, and new services; we use our own open-source tools to create the framework. Deployment is automated and accelerated through our open cloud platform, which makes us incredibly nimble and able to adjust in real time to our global customers. Employment type: full time. Requisition template: eCommerce.
Rockstar is seeking a senior-level data engineer with a passion for big data technologies to join a team focused on building a cutting-edge game analytics platform and tools to better understand our players and enhance their experience in our games. The ideal candidate will be skilled in developing complex ingestion and transformation processes, with an emphasis on reliability and performance. In collaboration with other data engineers, database administrators, and developers, the candidate will empower the team of analysts and data scientists to deliver data-driven insights and applications to company stakeholders.

Responsibilities:
- ETL design and development: assist in the development of a big data platform in Hadoop, using pipeline technologies such as Pig, Spark, Oozie, and more, to support a variety of requirements and applications
- Warehouse design and development: set the standards for warehouse and schema design in massively parallel processing engines such as Hadoop and Vertica, while collaborating with analysts and data scientists on the creation of efficient data models (a hedged sketch of a partitioned warehouse write follows this posting)
- Implement and support big data tools and frameworks such as HDFS, Hive, and Impala
- Implement and support streaming technologies such as Kafka and Spark
- Assist in the development of deployment automation and operational support strategies
- Deliver near-real-time and non-near-real-time data and applications to a team of analysts and data scientists who create insights and analytics applications for our stakeholders

Qualifications:
- 7+ years of work experience with ETL, data modeling, and business intelligence architectures
- Expert in at least one SQL dialect, such as T-SQL or PL/SQL
- Experience developing and managing data warehouses on a terabyte or petabyte scale
- Strong experience with massively parallel processing and columnar databases
- Experience working in a Linux environment
- Experience with Python and shell scripting
- Deep understanding of advanced data warehousing concepts and a track record of applying these concepts on the job
- Ability to manage numerous requests concurrently, strategically prioritizing when necessary
- Good communication skills; dynamic team player
- A passion for technology: we are looking for someone who is keen to leverage their existing skills and to seek out new skills and solutions

Preferred:
- Experience with the Hadoop ecosystem
- Experience with Vertica
- Experience with Tableau administration
- Experience implementing a machine learning pipeline
- Experience with real-time analytics applications
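For the warehouse and schema design work described above, a common pattern is writing fact data as date-partitioned columnar files registered as a Hive table. A minimal PySpark sketch under that assumption; the table, columns, and landing path are hypothetical:

```python
from pyspark.sql import SparkSession
import pyspark.sql.functions as F

spark = (SparkSession.builder
         .appName("session-facts")
         .enableHiveSupport()          # assumes a Hive metastore is configured
         .getOrCreate())

sessions = spark.read.json("/landing/sessions/")   # placeholder landing path

fact = (sessions
        .withColumn("play_date", F.to_date("session_start"))
        .select("player_id", "title_id", "play_date", "minutes_played"))

# Partitioning by date keeps scans cheap for the analysts' daily queries.
(fact.write
     .mode("append")
     .partitionBy("play_date")
     .format("parquet")
     .saveAsTable("analytics.fact_player_session"))
```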
The platform architect works within the ML/AI group and is responsible for the ongoing development of the data and machine learning platform. This role will participate in all aspects of the data application life cycle, including analysis, design, development, testing, production deployment, and support. The role will help formulate approaches to improve existing processes, develop opportunities, and present innovative solutions on cutting-edge technologies. You will create big data accelerators to help deploy scalable solutions fast. The platform architect will work with data architects and data scientists to evolve the data/ML platform while delivering both strategic and tactical projects that provide iterative value through impactful business outcomes. This individual should be highly self-motivated, with a strong problem-solving and analytical nature, a love of learning, a passion for insight, and a desire to achieve results.

Duties and responsibilities:
• Design and build data processing pipelines for structured and unstructured data using tools and frameworks in the big data ecosystem
• Architect, write code, complete programming, and perform testing and debugging of data applications
• Design and develop new systems and tools in a number of languages to facilitate effective data ingestion and curation: batch, scalable event frameworks, streaming, and real-time data analytic pipelines
• Build data APIs and data delivery services that support critical operational and analytical applications for our internal business operations, customers, and partners (a hedged sketch of a minimal prediction API follows this posting)
• Implement and configure big data technologies, and tune processes for performance at scale
• Design, develop, and maintain conceptual, logical, and physical data models for big data systems, through schema-on-write and schema-on-read delivery
• Perform data analysis with an understanding of the business and its processes, structuring both relational and distributed data sets
• Build and provision analytical sandboxes for data scientists and analysts; work with the data science team to bring machine learning models into production
• Conduct timely and effective research in response to specific requests (e.g., data collection, summarization, analysis, and synthesis of relevant data and information)
• DevOps for unit, functional, integration, and regression test plans; communicate with QA and port data engineering test scripts to the QA team
• Evaluate, benchmark, and integrate cutting-edge open-source data tools and technologies
• Monitor performance of the data platform and optimize as needed
• Work with machine learning and deep learning pipelines
• Work with diverse data, including images, words, video, and audio

Qualifications:
• Proficient understanding of distributed computing principles
• Programming and development on Hadoop, including an understanding of ZooKeeper, Oozie, and YARN
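"Build data APIs" and "bring machine learning models into production," from the duties above, often meet in a small HTTP service that wraps a trained model. A minimal sketch using Flask; the model artifact, feature names, and route are all hypothetical, and a production service would add input validation, auth, and monitoring:

```python
from flask import Flask, jsonify, request
import pickle

app = Flask(__name__)

# Placeholder artifact handed over by the data science team, e.g. a
# scikit-learn classifier pickled during training.
with open("model.pkl", "rb") as f:
    model = pickle.load(f)

@app.route("/predict", methods=["POST"])
def predict():
    payload = request.get_json(force=True)
    # Feature order is an assumption for this sketch.
    features = [[payload["age"], payload["tenure_days"], payload["events_7d"]]]
    score = float(model.predict_proba(features)[0][1])
    return jsonify({"churn_probability": score})

if __name__ == "__main__":
    app.run(port=8080)
```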
About the role: We are looking for a strong technical contributor with a background in software development to create intelligent data-driven systems. As a seasoned software engineer, you will be responsible for all phases of the development cycle: design, implementation, testing, and release. You will leverage your deep knowledge and experience to provide technical leadership for the team, take ideas from zero to completion, and provide the bridge between raw data and actionable business insights.

You will:
- Architect, build, and maintain highly scalable ETL pipelines and data-driven systems
- Work closely with machine learning engineers and data scientists to productionalize analytic solutions
- Design and implement new data products to support data scientists and business analysts
- Design and build infrastructure for our data lake

About the team: We build the pipelines and processes responsible for the daily ingestion of terabytes of data. We productionalize intelligent, data-driven systems to help Zillow capture strategic opportunities in the market. Our work enriches Zillow's unparalleled living database of all homes and hundreds of millions of customers, and empowers teams downstream to build analytics tools and products that delight our users. Small team = big impact: engineering teams are highly decentralized in order to create the small-team speed and autonomy of a start-up environment, backed by big-company resources. We are a fast-moving, developer-driven organization full of brilliant and ambitious people. Learn more about what we are doing at https://www.zillow.com/engineering and https://www.zillow.com/data-science.

Who are you?
- Deep experience building and shipping highly scalable distributed systems on cloud platforms (AWS, Azure, GCP) and database technologies (SQL, NoSQL, column-oriented datastores, distributed databases)
- Experience with the big data ecosystem (Hadoop, Hive, Spark, Presto, Airflow)
- Proven track record of leading and delivering large projects independently
- Proven ability to learn new technologies quickly
- A degree (BS, MS+) in computer science or a related technical discipline
- Experience with applied machine learning systems is a strong plus

Get to know us: Zillow Group houses a portfolio of the largest and most vibrant real estate and home-related brands on the web and mobile. Our mission is to build the largest, most trusted, and most vibrant home-related marketplace in the world. Zillow Group is owned, fueled, and grown by innovators who help people make better, smarter decisions around all things home. We encourage one another at every level, and our efforts are supported by employee-driven, world-class benefits that enable us to enjoy our lives outside the office while building fulfilling careers that impact millions of individuals every day. Zillow Group is an equal opportunity employer committed to fostering an inclusive, innovative environment with the best employees. Therefore, we provide employment opportunities without regard to age, race, color, ancestry, national origin, religion, disability, sex, gender identity or expression, sexual orientation, or any other protected status, in accordance with applicable law. If there are preparations we can make to help ensure you have a comfortable and positive interview experience, please let us know.

Title: Sr. Big Data Engineer or Architect
Location: downtown Chicago, IL
Duration: long term

Job description: The enterprise data systems team is seeking a hands-on developer who can bring their experience and expertise to drive many data-focused initiatives forward. The candidate will be responsible for the development and support of many different big data applications utilizing cutting-edge, cloud-based technologies, and will play a hands-on role on a team responsible for delivering mission-critical financial data to support business intelligence, data science, and operational applications.

Qualifications:
- Hands-on application developer with a passion for technology
- Working knowledge of CI/CD pipelines required
- Experience with cloud technologies; AWS certifications a plus
- Deep understanding of, and extensive experience with, streaming and batch big data technologies (Hadoop, Kafka, EMR, MapReduce, Spark, Flink, etc.)
- Strong experience with Java development and a solid understanding of distributed technologies
- Experience with deployment and configuration tools a plus (Chef, Docker, etc.)
- Working knowledge of SQL and relational databases, data modeling, and design patterns
- Strong experience with Linux/Unix and scripting languages
- Ability to multi-task and work in a fast-paced, dynamic environment with extremely tight timelines
- Familiarity with NoSQL (MongoDB, HBase, or Cassandra) is a plus
- 8+ years of software development experience, with 4+ years' experience with data-related technologies
- Computer science, software engineering, computer engineering, or similar degree (or demonstrated equivalent experience)
- Financial industry knowledge, including derivatives trading or clearing, is a plus

Sampath: 925-399-8510, sampathc@msrcosmos.com - provided by Dice
We are looking for a talented data engineer to join our technology team. The individual will be working on large-scale data warehouse and search engine applications and multiple ETL processes, must have a strong passion for data engineering, warehousing, and ETL technologies, and must be willing to work in a challenging team environment that promotes personal accountability.

Key responsibilities and duties:
- Participate in product delivery through the entire SDLC of analysis, design, development, and testing
- Build data warehousing, search engine, and business intelligence (BI) applications
- Establish highly secure ETL processes across multiple data sources and sinks within our ecosystem (a hedged load-step sketch follows this posting)
- Closely collaborate with the data analytics, BI, engineering, and DevOps teams
- Consistently perform at the highest levels and bring high energy, confidence, and ambition to work every day

Qualifications and skills:
- 2+ years of strong experience in data warehousing and ETL processes
- Demonstrated hands-on, solid experience writing complex SQL and building high-volume ETL pipelines and batch processes
- Proficient and able to code in languages like bash scripts, Python, Java, and/or Scala
- Experience with Pentaho Data Integration (PDI) and AWS Redshift; additional experience with AWS Kinesis, Elastic Stack, Apache Spark, Flink, and Storm is a big plus!
- Experience working in a fast-moving development team using agile/scrum methodology
- A minimum of a bachelor's degree (four-year degree) is required to be considered for this position, preferably in computer science, engineering, science, mathematics, or a related field
- Candidate must be a U.S. citizen to be considered for this position
- Work location is on-site in Washington, D.C., with easy access to public transportation

Benefits: We seek talent, and in exchange we offer very competitive pay and full benefits, including medical, dental, vision, life insurance, accidental death and personal loss benefits, long-term disability, a flexible spending account, voluntary short-term disability, voluntary term life insurance, voluntary accident insurance, a 401(k) retirement plan, the DC/VA/MD transit benefit program, paid vacation, sick leave, and other employee benefits.
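One common shape for the Redshift ETL work listed above is staging files on S3, issuing a COPY, validating the load, and then merging into the serving table. A minimal sketch, assuming psycopg2, a reachable cluster, and placeholder DSN, bucket, table, and IAM role values:

```python
import psycopg2

# All identifiers below are placeholders for illustration only.
DSN = "host=example.redshift.amazonaws.com port=5439 dbname=dw user=etl password=..."
COPY_SQL = """
    COPY staging.orders
    FROM 's3://example-bucket/orders/2018-03-14/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/etl-load'
    FORMAT AS CSV IGNOREHEADER 1;
"""

with psycopg2.connect(DSN) as conn:
    with conn.cursor() as cur:
        cur.execute("TRUNCATE staging.orders;")
        cur.execute(COPY_SQL)
        cur.execute("SELECT COUNT(*) FROM staging.orders;")
        loaded = cur.fetchone()[0]
        if loaded == 0:
            raise RuntimeError("COPY loaded no rows; aborting merge")
        # Delete-then-insert merge so re-runs stay idempotent; the whole
        # block commits (or rolls back) as one transaction on exit.
        cur.execute("""
            DELETE FROM dw.orders USING staging.orders s
             WHERE dw.orders.order_id = s.order_id;
            INSERT INTO dw.orders SELECT * FROM staging.orders;
        """)
```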
Data Visualization Developer
IT / Cyber Security / Network Systems; Rosslyn, Virginia

**Description** (position at ActioNet): ActioNet is seeking a data analyst and visualization specialist who enjoys working directly with clients to design and implement custom data analytic and reporting solutions, including dashboards, in a dynamic and fast-paced environment. This includes working alongside a diverse team of ICF data scientists, data engineers, and business analysts to strengthen data integrity, enhance business processes, and improve program design through insightful analysis for government and commercial clients. The ideal candidate will have strong capabilities in the following areas.

What you'll be doing:
+ Perform extensive data profiling and analysis of client data
+ Support project delivery on data warehouse / BI projects for external and internal clients, technical and non-technical, including partnering with ICF subject matter experts on project execution
+ Link reporting needs to existing BI tools and help identify solutions that provide the best business value to the client
+ Develop custom reports and data visualization products using large datasets to transform data into actionable insights
+ Code and create impactful reports, interactive dashboards, and visualizations using **Business Objects**, **Microsoft Power BI**, **Tableau**, and other BI tools
+ Learn from, and share knowledge and skills with, your teammates to grow BI's total impact on the organization
+ Use JSON, XML, JavaScript, and CSS to create web-based representations of data

Basic qualifications:
+ 2+ years of relevant experience in report development
+ Solid knowledge of UI design (HTML5, CSS, JavaScript, Python)
+ Excellent oral and written communication skills, and comfort presenting to everyone from entry-level employees to senior stakeholders
+ Curiosity and passion about data visualization and solving problems
+ Minimum Secret clearance

Preferred skills:
+ Knowledge of SQL to write complex, highly optimized queries across large volumes of data a plus
+ The ability to integrate geospatial components into dashboards and reports

ActioNet is an equal opportunity / affirmative action employer. All qualified candidates will receive consideration for employment without regard to disability, protected veteran status, race, color, religious creed, national origin, citizenship, marital status, sex, sexual orientation, gender identity, age (40 or over), or genetic information. ActioNet's commitment to diversity and inclusive selection practices includes ensuring qualified long-term unemployed job seekers receive equal consideration for employment. The ActioNet career center is accessible to any and all users. If you would like to contact us regarding the accessibility of this portal, or you need assistance completing the application process, please contact Jonathan Dobles, technical recruiter, at 703-204-0090 ext. 195 or jdobles@actionet.com. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications.
Job description: Are you someone with a passion for data analytics, insights, and technology? Do you want to be part of a team lighting up actionable insights to help the organization make business decisions? If you excel at blending quantitative analysis with strategy development and want to influence the future of core services at Microsoft, then this position is for you. The data team in Shared Services (part of Core Services Engineering) is looking for a passionate, creative, analytical, and experienced data engineering leader. If you are passionate about big data, data science, and telemetry-driven insights, then this is the role for you. As a principal engineering lead, you will lead our high-performing data engineering team, which is responsible for designing, developing, and supporting a complex data platform to enable enterprise-level BI and data analytical solutions for the CSEO organization. This role requires cutting-edge cloud and big data technical skills, as well as excellent communication and collaboration skills. You should be a data expert with a solid understanding of large-scale data ingestion and processing, from architecture to coding, and be comfortable owning the holistic design of a complex data platform and making architectural decisions independently. You are driven, self-directed, entrepreneurial, and focused on delivering the right results. To be successful in this role, you must have strong written and oral communication skills, a can-do attitude, and the willingness to tackle hard problems in innovative ways.

**Responsibilities** Key responsibilities:
• Connect findings and recommendations to business initiatives, and collaborate with key stakeholders at various management levels
• Engage with partner organizations and customers to develop technical roadmaps and backlogs that align to customers' business requirements
• Lead a data engineering team to analyze complex, high-volume, high-dimensionality data from varying sources, using a variety of ETL and data analysis techniques
• Lead a data engineering team to architect, design, and develop highly scalable and reliable end-to-end enterprise data platform and BI reporting solutions
• Lead and mentor team members to embrace and adopt new technologies

Knowledge, experience, and skills:
• Expert understanding of the full life cycle of software development and engineering excellence processes
• Solid analytical and problem-solving ability; good at making difficult architectural design decisions
• Strong collaboration skills to collectively work with customers, users, business analysts, and developers to define data requirements and turn them into an actionable project plan
• Strong technical skills and experience in turning technology strategy into hands-on implementation, from vision to proof of concept to production code
• In-depth knowledge and extensive practical experience in leading the analysis and model design of the platform's data structure, storage, integration, deployment, and support, to enable enterprise BI and advanced analytics to gain data insights

**Qualifications** Basic qualifications:
• 5-7 years of hands-on experience in object-oriented and/or SQL programming
• 5+ years in a lead or manager role

Preferred, not required:
• 1+ years of hands-on development experience in Microsoft Azure and the big data ecosystem, such as Hadoop, Hive, Spark, HDInsight, and Cosmos
• Familiarity with key Microsoft Azure data ingestion tools such as Azure Event Hubs, Azure Data Factory, and Azure Stream Analytics services
• Bachelor's in statistics, math, computer science, economics, business, or engineering
• Advanced BI dashboard design and development
• Familiarity with machine learning and AI technologies/tools is a big plus
• Business domain expertise in the areas of sales, marketing, supply chain, finance, and/or HR

The ideal candidate will have experience in a team environment, experience running and designing enterprise-scale services and platforms, technical depth in cloud platforms and agile development practices, and experience in designing and tuning telemetry. In addition, this position requires an individual who can demonstrate the ability to ensure highly resilient and scalable service designs through partnership with other members of the service team. Microsoft is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, ancestry, color, family or medical care leave, gender identity or expression, genetic information, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran status, race, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable laws, regulations, and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application or recruiting process, please send a request to askstaff@microsoft.com.
Role: Hadoop Lead Architect
Location: San Francisco, CA

Qualifications and requirements:

Responsibilities:
- Develop technical content, such as white papers and reference architectures, that our customers can use to assist them in deploying cloud-based analytic solutions
- Plan and execute a technology proof of concept (POC) using big data technology
- Experience handling structured and unstructured data using big data (tools, best practices, and industry trends)
- Experience architecting and implementing the full life cycle of a data analytics Hadoop solution

Basic qualifications:
- Three or more years of experience in the design and implementation of data analytics environments using Hadoop solutions
- Significant hands-on experience with data analytic tools, including Apache Hadoop, Spark, Storm, and Kafka
- Machine learning: Mahout, R, etc.
- Query: Hive, Pig, Redshift
- Hands-on experience with other Hadoop ecosystem components and/or NoSQL databases: Cassandra, MongoDB, CouchDB, MarkLogic, HBase
- Exceptional interpersonal and communication skills
- Demonstrated effectiveness working across multiple business units to achieve results
- An understanding of cloud computing deployment models as they relate to data and analytics
- Five or more years of experience in a data-related role, such as data scientist or data architect, would be a great plus

Enterprise Data Architect
Apply now. Date: Mar 23, 2018. Location: Boston, MA, US 02110. Company: Houghton Mifflin Harcourt. Job requisition ID: 11561.

The opportunity: Enterprise Data Architect.
About the job: HMH Learning Technology is looking for an enterprise data architect who is a hands-on analyst, technologist, and architect with deep experience in data architecture, integration, data warehousing, and analytics technologies. In this role you will provide tactical and strategic direction in the areas of data conversion, business intelligence, analytics, and visualization. This person leads design for relational databases following industry best practice. The data integration architect will consult with engineering and data analytics to design and implement scripts, programs, databases, software components, and analyses that will support product quality and an in-depth understanding of potential uses of the data. The data integration architect has business responsibility for determining the information the enterprise will capture, retain, and exploit. This position is located in Boston, Massachusetts.

Duties and responsibilities include, but are not limited to, the following:
+ Work closely with clients on customer-specific initiatives that involve integrations between our product and one or more external systems/tools
+ Determine APIs that need to be created, and then architect, design, and develop the APIs using different languages, all dependent on the client's environment
+ Define standards for naming, describing, governing, managing, modeling, cleansing, enriching, transforming, moving, storing, searching, and delivering all data within the product
+ Serve as the liaison between data consumer representatives and the data solution development, integration, and governance team
+ Conduct requirements-gathering and design sessions with stakeholders and data suppliers to understand incoming data and relationships
+ Work with business and engineering teams to apply data security policies
+ Create conceptual, logical, and physical models for data warehouse implementations
+ Provide source-to-target mappings from data models to provide the basis for data integration specifications
+ Design time-variant models that support both point-in-time (historical) and current-state analysis (a hedged sketch of this pattern follows this posting)
+ Assist with SLA definition and analysis, and with application performance tuning

Requirements, skills, and experience: This person is a self-motivated individual with excellent team collaboration and organizational skills who possesses:
+ Experience as a data integration architect developing complex logical data models that support business processes
+ Experience with one or more of the following: big data, data analytics, data science, Spark, cloud technologies such as Azure or AWS, Hadoop, HANA, or Databricks
+ Experience developing data models for data warehouse reporting and analytic solutions (dimensional models)
+ Understanding of data warehousing, data governance, and master data management best practices
+ Broad knowledge and application of architecture design patterns supporting multiple processing styles and technical requirements
+ Advanced or expert knowledge and understanding of multiple IT areas
(networking, software, data, security, internet operations) or architecture and core business functions
+ Experience applying data security policies
+ Experience working with data owners in developing data and integrating data requirements into the larger development effort, in coordination with project managers, business analysts, and development leads

Physical requirements:
+ Might be in a stationary position for a considerable time (sitting and/or standing)
+ The person in this position needs to move about inside the office to access file cabinets, office machinery, etc.
+ Constantly operates a computer and other office productivity machinery, such as a calculator, copy machine, and computer printer
+ Must be able to collaborate with colleagues face to face and via conference calls and online meetings

About us: Houghton Mifflin Harcourt (NASDAQ: HMHC) is a global learning company dedicated to changing people's lives by fostering passionate, curious learners. As a leading provider of pre-K-12 education content, services, and cutting-edge technology solutions across a variety of media, HMH enables learning in a changing landscape. HMH is uniquely positioned to create engaging and effective educational content and experiences from early childhood to beyond the classroom. HMH serves more than 50 million students in over 150 countries worldwide, while its award-winning children's books, novels, non-fiction, and reference titles are enjoyed by readers throughout the world. For more information, visit http://careers.hmhco.com. Houghton Mifflin Harcourt is committed to a comprehensive policy of equal opportunities, and we aim to create a workplace which provides equal opportunities for all employees and potential employees. Nearest major market: Boston.
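The "time-variant models that support both point-in-time (historical) and current-state analysis" duty above is commonly implemented as a type-2 slowly changing dimension, where each row version carries validity dates. A minimal pandas sketch under that assumption; the customer table, columns, and sentinel end date are invented for illustration:

```python
import pandas as pd

# Type-2 SCD-style history: each version of a row carries valid_from /
# valid_to, so both "as of" (historical) and "current" queries work.
history = pd.DataFrame({
    "customer_id": [1, 1, 2],
    "segment":     ["bronze", "gold", "silver"],
    "valid_from":  pd.to_datetime(["2017-01-01", "2017-09-01", "2017-03-01"]),
    "valid_to":    pd.to_datetime(["2017-09-01", "2262-04-11", "2262-04-11"]),
})

def as_of(df, ts):
    """Point-in-time view: rows whose validity interval contains ts."""
    ts = pd.Timestamp(ts)
    return df[(df.valid_from <= ts) & (ts < df.valid_to)]

print(as_of(history, "2017-06-01"))   # historical view: customer 1 is bronze
print(as_of(history, "2018-01-01"))   # current view: customer 1 is gold
```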
Reach out to us today. ILW is a motivated, fast-growing solution enablement company. We provide expert-level consulting, design, and development services that deliver results for our clients. We look for the best of the best: those who are attracted to a challenging and rewarding career experience, not just a job. Our ideal candidate exhibits passion, patience, and perseverance with an entrepreneurial mindset. We have recruited and retained some of the best technical and professional talent in the industry.

Summary: the big data developer is typically a Java/Scala developer with an interest in data. Big data is about finding answers to questions you haven't thought to ask yet; it involves aspects of database design and development, algorithm development, and data science, using entirely disparate data types and sources. Do you have what it takes?
- Excellent verbal and written communication skills and the ability to interact professionally with a diverse group: executives, managers, and subject matter experts.
- Outstanding problem-solving skills.
- Ability to successfully pass a post-offer, pre-employment drug test and background screen.
- Desire to grow a career within big data.
- Bachelor's or MS degree in computer science, mathematics, or a comparable academic discipline.
- 2+ years of demonstrable big data experience.
- Experience in debugging, profiling, and load testing on the JVM, particularly distributed applications.
- Experience with SQL Server or another relational database management system.
- Knowledge of open-source tools, technologies, platforms, or databases.
- Expertise in Hadoop, Hive, Pig, Hue, MySQL, MongoDB, etc.
- Familiarity with Linux administration and shell scripting.
- Data modeling experience.
- Data exploration and discovery experience.
- Data processing knowledge, to include batch and streaming mechanisms (see the sketch after this posting).
- Experience with Apache HDFS, Hive, MapReduce, YARN, and Spark.
- Ability to work with both internal and external partners to ensure that the respective data sets are ingested into our big data platform.
- Knowledge of big data and development best practices.
- Ability to assist with analytic tooling enablement and capability within the big data platform.

Why choose us? As a company, we invest in our employees in all aspects of life. We understand that your health and your family's health are very important, along with your time here at ILW. Listed below are some of the top benefits and perks if you choose to be part of our team: market-competitive salary; generous PTO package; comprehensive medical, dental, vision, and life insurance plans; 401k; short- and long-term disability insurance; a fun and engaging culture; and ongoing training, education, and industry partnerships that allow you to be up to speed on the latest technologies and processes. Illumination Works LLC is committed to hiring and retaining a diverse workforce. We are an equal opportunity employer, making decisions without regard to race, color, religion, sexual orientation, gender identity or national origin, age, veteran status, disability, or any other protected class. U.S. citizenship is required for most positions. If you are interested in joining our team and being part of an outstanding group of IT professionals providing innovative solutions to a wide variety of clients, and feel you meet the prior requirements, please contact us!
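As a rough illustration of the batch and streaming processing mechanisms the posting above lists, here is a minimal PySpark sketch, assuming a Spark installation with the Kafka connector package available. The HDFS paths, broker address, topic, and column names are all hypothetical examples, not anything the posting specifies.

```python
# Minimal PySpark sketch: one batch aggregation plus one streaming ingest.
# All paths, the Kafka broker/topic, and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("batch-and-streaming-sketch").getOrCreate()

# Batch: aggregate daily event counts from files already landed in HDFS.
events = spark.read.parquet("hdfs:///data/raw/events")
daily = (events
         .groupBy(F.to_date("event_ts").alias("event_date"), "event_type")
         .agg(F.count("*").alias("event_count")))
daily.write.mode("overwrite").parquet("hdfs:///data/curated/daily_counts")

# Streaming: land the same event feed from Kafka in micro-batches.
stream = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "events")
          .load()
          .selectExpr("CAST(value AS STRING) AS payload"))
(stream.writeStream
       .format("parquet")
       .option("path", "hdfs:///data/raw/events_stream")
       .option("checkpointLocation", "hdfs:///checkpoints/events_stream")
       .start()
       .awaitTermination())
```

Submitted with spark-submit, the batch half could run on a schedule while the streaming half runs continuously against the same cluster.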
Job details. Job description: At McAfee we are relentless in protecting our customers; we believe safe never sleeps. In doing so, we are leading the transition from the era of point security products to integrated security systems. As we automate the threat defense lifecycle, we lead the industry in delivering better security outcomes to customers: address more threats, faster, and with fewer resources. We work to achieve that automation for enterprise customers by focusing on the endpoint and cloud as key control points, analyzing that telemetry to detect advanced threats, automating the management and remediation processes, and working with technology partners throughout the industry.

McAfee is looking for a hands-on software architect to design and build a next-generation telemetry and analytics system. This will be a big data system: ingesting, processing, and storing large amounts of data around the clock at high velocity. The ideal candidate will have experience building large data processing pipelines, ideally in a public cloud, using best-of-breed big data technologies. You will report directly to the engineering and operations director and will collaborate with experienced engineering, operations, and data science teams as well as a database architect. You will design and build a system that must operate reliably at scale to acquire telemetry via McAfee's product footprint in the market and feed that telemetry to downstream systems that will use the data to generate better and faster protection against internet threats, keeping McAfee's customers safe.

Qualifications:
- Experience building big data processing systems in the public cloud (AWS, Azure) that operate at high velocity and high throughput (fast, not just big, data).
- Experience with leading-edge big data technologies such as Kafka, Spark, Hadoop, MongoDB, and SQL.
- Experience with large data-at-rest technologies such as data marts and BI, and with systems that support and feed into cold storage and process batch data.
- Experience with data pipelines that must correlate data across non-homogeneous systems.
- Solid understanding of data hygiene, data quality, and data management governance, for example data flow documentation and measurement.
- Experience building systems with high uptime (3+ nines) and reliability requirements.
- Experience designing and supporting data pipelines for a large number of stakeholders with disparate needs.
- Good communication skills to work effectively with a diverse set of engineering, operations, data science, technical product, and managerial stakeholders.
- Understanding of modern software engineering methodologies, such as DevOps, to build systems that can be operated with a minimum of administrative overhead.
- An understanding of computer security, ideally with familiarity with various certification and federal government criteria.

Our mission: to relentlessly protect all that matters through leading-edge cybersecurity, from your workplace to your home and everywhere in between. Our vision: to enable a world where cybersecurity is so consistent, reliable, and effective that it becomes a trusted foundation in our lives, like clean air and water. Our technology enables the world to fully realize the transformative power of the digital age by protecting all that matters; by doing our job well, we drive limitless innovation, securely. Our values: we live our values day in and day out. Do you think you can live our values with us?
If you can, don't think, just connect with us. Together is power. We achieve excellence with speed and agility. We play to win or don't play. We innovate without fear. We practice inclusive candor and transparency. We put the customer at the core. Join our talent community: http://careers.mcafee.com. McAfee prohibits discrimination based on race, color, religion, gender, national origin, age, disability, veteran status, marital status, pregnancy, gender expression or identity, sexual orientation, or any other legally protected status. Job type: experienced hire. Shift: shift 1 (United States of America). Primary location: US, Oregon, Hillsboro. Last modified: 2/15/2018, 5:54:34 PM. Department: Engineering.

Job location: United States : North Carolina : Cary.

Role value proposition: MetLife is seeking an IT director of data architecture to join its centralized enterprise architecture team, which is focused on delivering strategic architectural solutions that enable MetLife to meet its business goals. This director will lead a senior team of data architects and will be a core member of the digital channels and data architecture team. The director will be responsible for the day-to-day operations of the data architecture team, including project, resource, and personnel management as well as technical oversight. The director will be responsible for ensuring the delivery of conceptual and solution architecture designs across the global analytics and data services portfolio; in this position the focus will primarily be on database, data quality, big data, and analytics platforms supporting multiple global tenants. In this role the director will be responsible for the architecture of a cutting-edge big data platform based on Hadoop (Hortonworks) and advanced analytics tooling. This position will also be involved with defining data platform architectures using cloud-based container infrastructure. The director will ensure that all solution architectures and new technology acquisitions are delivered through MetLife's architectural governance review processes and follow established enterprise reference architectures and standards. They will also collaborate with global IT partners in architecture and development to deliver future-state architectures for the portfolios assigned. The ideal candidate will have strong business and technology acumen in order to develop sound architecture solutions that will meet and exceed business needs, as well as experience managing senior technical teams in an enterprise setting. Skills in big data and data science practices, along with the capacity to adapt to a changing technology landscape, are essential. Effective communication with IT and business associates at all levels, including senior management, will be required. This position may require some travel, depending on current assignment, as well as working with global partners in time zones outside of the U.S.

Functional responsibilities:
+ Lead the data architecture team from a technical and personnel management perspective.
+ Manage the creation, analysis, and review of data-related solution architectures and ensure their integrity, security, and strategic alignment.
+ Lead multiple complex or large-scale projects as assigned.
+ Lead the project planning of architecture activities and provide estimates for architecture work.
+ Lead the development of future-state architectures for a given portfolio.
+ Maintain a broad, deep understanding of current business applications and technical platforms in the global analytics and data services portfolio.
+ Maintain a broad, deep understanding of multiple industry technologies and reference architectures.
+ Lead the development of horizontal solutions for projects and programs.
+ Partner with application development teams in validating and implementing proposed solutions.
+ Partner with global partners in architecture and IT to ensure that global solutions accommodate local and regional requirements, including language support and data sovereignty restrictions.
+ Prepare presentations tailored for various audiences (e.g., executive management, business partners, IT partners).
+ Ensure the completion of product assessments and vendor evaluations as needed, and guide these through the approved software governance processes.
+ Ensure architecture standards (reference architectures, approved technology stacks) and best practices are adhered to.
+ Maintain big-picture focus while working on individual solution architectures.
+ Proactively keep management informed of project status, issues, and dependencies.
+ Mentor and lead the professional development of team members.

Required qualifications:
+ Minimum of 5 to 8 years of related architecture experience.
+ 3+ years of hands-on experience in software engineering and development.
+ Bachelor's degree in computer science or an engineering discipline.
+ Experience directly managing technical staff.
+ Demonstrated proficiency in coaching and mentorship.
+ Demonstrated oral and written communication proficiency.
+ Demonstrated influencing and negotiating skills.
+ Experienced at effectively interacting with associates at all levels.
+ Proficient at working independently with minimal supervision while consistently meeting all expectations.
+ Ability to lead and/or work simultaneously on multiple large, complex projects.
+ Demonstrated expertise in one or more architecture disciplines (e.g., application architecture, data architecture, infrastructure architecture, integration architecture).
+ Demonstrated experience in understanding, analyzing, applying, and communicating abstract architectural concepts.
+ Proficient at working within the software development life cycle (SDLC) in a large enterprise.
+ Broad knowledge of web application concepts, design patterns, and technologies (e.g., single-page applications, microservices).
+ Broad knowledge of integration technologies and patterns such as REST, ETL, MFT, SOA, and MQ.
+ Broad knowledge of big data concepts, design patterns, and technologies (e.g., Hadoop, data streaming).
+ Broad knowledge of API management, API design, and API security (OAuth, basic auth, etc.).
+ Experience with BI, reporting, and analytics.
+ Experience with the architecture and design of data-centric projects (e.g., data warehouse, data lake, data marts).
+ Extensive experience with information architecture, data integration, data management, and database technologies (e.g., RDBMS, NoSQL, Mongo, Hadoop, Hortonworks).
+ Experience with cloud concepts (IaaS, SaaS, PaaS, microservices, serverless) and major cloud providers such as Microsoft Azure or AWS.
+ Experience with security architecture patterns and technologies (e.g., web access management, single sign-on, federation, data encryption).

Preferred qualifications:
+ Experience managing a senior team of technical architects.
+ Experience with big data
platforms, particularly Hortonworks, and their usage in a large enterprise environment.
+ Experience with data science, machine learning, and advanced analytics technologies and their usage in a large enterprise environment.
+ Experience working with customers and partners outside of the US, including Asia and Latin America.

MetLife is a proud equal opportunity/affirmative action employer committed to attracting, retaining, and maximizing the performance of a diverse and inclusive workforce. It is MetLife's policy to ensure equal employment opportunity without discrimination or harassment based on race, color, religion, sex (including pregnancy, childbirth, or related medical conditions), sexual orientation, gender identity or expression, age, disability, national origin, marital or domestic/civil partnership status, genetic information, citizenship status, uniformed service member or veteran status, or any other characteristic protected by law. MetLife maintains a drug-free workplace. For immediate consideration, click the Apply Now button; you will be directed to complete an online profile, and upon completion you will receive an automated confirmation email verifying you have successfully applied to the job. Requisition #: 96414.

Sauce Labs provides the world's largest automation cloud for testing web and native/hybrid mobile applications. Founded by the original creator of Selenium, Sauce Labs helps companies accelerate software development cycles, improve application quality, and deploy with confidence across 500+ browser/OS platforms. Join us in making the world a better place for continuous integration and software development. We're building a next-generation infrastructure-as-a-service platform.

Responsibilities: the candidate for this role is the glue between database engineering (ops) and data science and analytics.
- Work with the data architect, data science, and analytics teams to prototype and build data solutions.
- Work with the data architect and analytics team to define and build data tables for dashboard reporting and pre-processed datasets for advanced analytics and machine learning.
- Work with security to implement data privacy and data security requirements, ensuring solutions stay compliant with security standards and frameworks (such as ISO and SOC 2).
- Contribute to the core design of the data architecture, data models and schemas, and the implementation plan.
- Write Python, Java, or Scala scripts for data cleaning, normalization, standardization, and aggregation (see the sketch after this posting).
- Integrate data from various data stores to ensure consistency and availability of data insights.
- Use best practices in terms of testing, monitoring, alerting, auto-recovery, design patterns, etc.

Requirements:
- BS or MS in computer science, engineering, or a related technical discipline, or equivalent experience.
- 3+ years with hosted SQL data warehousing for ad hoc analytics, such as Redshift, Athena, and BigQuery.
- 3+ years of experience operating databases (e.g., Redshift, MySQL, MongoDB) and advanced query authoring.
- 3+ years of dimensional data modeling and schema design in data warehouses.
- 3+ years of experience with Looker or other BI tools.
- 3+ years of data archiving, partitioning, and offloading.
- 5+ years of Python scripting.
- Proven hands-on experience with dimensional modeling.
- MySQL query optimization.
- Familiarity with how MySQL replication and clustering work.
- Excellent communication skills, particularly translating between technical and non-technical stakeholders.
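A minimal sketch, in Python with pandas, of the kind of cleaning, standardization, and aggregation script the posting above describes. The input file and every column name here are hypothetical examples; a real pipeline would presumably read from the warehouse rather than a local CSV.

```python
# Minimal pandas sketch of a clean -> standardize -> aggregate script.
# The file name and all column names are hypothetical examples.
import pandas as pd

raw = pd.read_csv("test_sessions.csv")

# Cleaning: drop exact duplicates and rows missing the key fields.
clean = raw.drop_duplicates().dropna(subset=["account_id", "duration_sec"]).copy()

# Standardization: normalize free-text platform labels to one casing.
clean["platform"] = clean["platform"].str.strip().str.lower()

# Aggregation: a pre-processed dataset for dashboard reporting.
summary = (clean.groupby(["account_id", "platform"], as_index=False)
                .agg(sessions=("duration_sec", "size"),
                     avg_duration_sec=("duration_sec", "mean")))
summary.to_parquet("session_summary.parquet", index=False)
```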
Company description: Red Alpha is a fast-growing, high-tech government contractor providing exceptional consulting and engineering services, based in Hanover, Maryland. Red Alpha is committed to our clients, our employees, and our mission. Our team consists of dedicated self-starters with the unique ability to succeed. Our company is employee-focused, with superior benefits, involved leadership, and a proven history of success. Red Alpha seeks self-motivated and passionate professionals, because results follow hard work and passion. We believe that our staff is our greatest asset, because they are both our implementers and our facilitators, and as such we invest heavily in them so that they reach their career goals while satisfying their penchant to continuously grow. Join us as we apply our skills to tackle some of the toughest, most interesting, and rewarding challenges in the intelligence sector. Red Alpha wants you to become a leader in the industry and will work with you to enable our mutual success. We are rapidly growing and continuously searching for the best and brightest talent to add to our prestigious group of engineers. We have many immediate openings based in the Hanover, Maryland area and in northern Virginia. Intrigued? Please apply today! We would love to discuss your career goals to see how we can partner to achieve them. Red Alpha specialties: cloud computing and administration, enterprise Java web application development, HPC system administration, data science, and much more.

Job description: Sr. SAP HANA Developer / Team Lead. Important note: resources must be US citizens in order to be considered; this is a public sector federal client with data security access requirements. Description and requirements:
• Our client is currently seeking multiple senior-level SAP HANA development resources to participate in a full-scale big data expansion program, which will integrate the Hortonworks HDP platform as an enterprise data foundation in conjunction with SAP HANA as a real-time analytics platform.
• Any experience or knowledge with Hortonworks is a huge plus!
• Qualified resources must have a minimum of 3+ years of hands-on SAP HANA development experience, working with core HANA data provisioning tools and concepts (SAP Data Services, DXC, SLT); proven modeling skills within HANA Developer Studio (analytic and calc view development); and experience with SAP data visualization, exploration, reporting, and user experience tools (Lumira, HANA Live, HANA XS, Fiori). An additional 7–10 years of SAP BW development experience is required to support continuing expansion of the BW-on-HANA framework, including migrating BW to BW on HANA.
• Experience and/or knowledge of HANA Vora and Java RPC connectivity to Hadoop is highly desired.
• Excellent communication skills required; must be a US citizen in order to be considered.
Additional information: Red Alpha is an equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by law.

Our mission is to help people everywhere find a job and company they love. We are disrupting an industry by changing how people search for jobs and how companies recruit top talent. We are looking for a talented engineer to join our growing data engineering team. The ideal candidate has significant experience in building scalable data platforms that enable business intelligence, analytics, data science, and data products. You must have strong hands-on technical expertise in a variety of technologies and the proven ability to fashion robust, scalable solutions. You should have a passion for continuous improvement and data quality. We embrace a wide variety of technologies and work very closely with data scientists and business stakeholders to deliver end-to-end solutions.
If you are interested in a fast-paced environment, the latest technologies, and fun data problems, come join us!

Responsibilities:
- Design and develop big data applications using a variety of different technologies.
- Develop logical and physical data models for big data platforms.
- Automate workflows using Apache Airflow (a minimal example follows this posting).
- Write data pipelines using Apache Hive, Apache Spark, and Apache Kafka.
- Create solutions on AWS using services such as Lambda and API Gateway.
- Provide ongoing maintenance and enhancements to existing systems, and participate in rotational on-call support.
- Learn our business domain and technology infrastructure quickly, and share your knowledge freely and proactively with others in the team.

Key qualifications:
- 5+ years of hands-on experience developing data warehouse solutions and data products.
- 2+ years of hands-on experience developing a distributed data processing platform with Hadoop, Hive, Spark, Airflow, Kafka, etc.
- 2-3 years of hands-on experience modeling and designing schemas for data lakes or for RDBMS platforms.
- Experience with programming languages: Python, Java, Scala, etc.
- Experience with scripting languages: Perl, shell, etc.
- Practice working with, processing, and managing large data sets (multi-TB/PB scale).
- Exposure to test-driven development and automated testing frameworks.
- Background in Scrum/agile development methodologies.
- Capable of delivering on multiple competing priorities with little supervision.
- Excellent verbal and written communication skills.
- Bachelor's degree in computer science or equivalent experience.

Nice to have:
- Experience building machine learning pipelines or data products.
- Familiarity with AWS or GCS technologies.
- Passion for, or past contributions to, open-source engineering projects.

Why Glassdoor?
- Work with purpose: join us in creating transparency for job seekers everywhere.
- Glassdoor gives back! Glassdoor is a Pledge 1% member; all employees receive 3 paid volunteer days per year.
- 100% company-paid medical, dental, vision, and life coverage; 85% dependent coverage.
- Equity in a late-stage startup backed by top-tier VCs.
- Sunny and peaceful Mill Valley offices located right on the water.
- Walking, running, and biking trails steps away from the office.
- Onsite gym and fitness classes.
- Free catered lunch; new menu daily.
- Paid holidays and flexible paid time off.
- Your choice between Mac or PC.
- Dog-friendly office (with dog-free zones if you are so inclined).
- Free parking.
Glassdoor is committed to equal treatment and opportunity in all aspects of recruitment, selection, and employment, without regard to gender, race, religion, national origin, ethnicity, disability, gender identity/expression, sexual orientation, veteran or military status, or any other category protected under the law. Glassdoor is an equal opportunity employer, committed to a community of inclusion and an environment free from discrimination, harassment, and retaliation.
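Since the responsibilities above call out Airflow-orchestrated workflows, here is a minimal sketch of what such a workflow can look like, assuming an Apache Airflow 2.x install. The DAG id, schedule, and the two placeholder callables are hypothetical examples rather than anything the posting specifies.

```python
# Minimal Airflow sketch: a daily extract -> load workflow with two tasks.
# The DAG id, schedule, and task bodies are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_events(**context):
    """Pull one day's partition from the source system (placeholder)."""
    ...

def load_warehouse(**context):
    """Write the transformed partition to the warehouse (placeholder)."""
    ...

with DAG(
    dag_id="daily_events_load",
    start_date=datetime(2018, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_events)
    load = PythonOperator(task_id="load", python_callable=load_warehouse)
    extract >> load  # load runs only after extract succeeds
```

The `>>` operator declares the dependency edge, so the scheduler runs the two tasks in order once per day.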
Do you want to build premium shopping experiences for millions of customers? Do you want to work on the performance challenges of serving the best recommendations in under 200 milliseconds, given millions of customers and millions of products? Are you interested in working on machine learning and data science, believing every customer should not have the same experience?

As a senior data engineer you will be working in one of the world's largest and most complex data warehouse environments. You should be an expert in the architecture of DW solutions for the enterprise using multiple platforms (RDBMS, columnar, cloud). You should excel in the design, creation, management, and business use of extremely large datasets. You should have excellent business and communication skills, to be able to work with business owners to develop and define key business questions and to build data sets that answer those questions. Above all, you should be passionate about working with huge data sets and be someone who loves to bring datasets together to answer business questions and drive change.

Qualifications:
· A desire to work in a collaborative, intellectually curious environment.
· Degree in computer science, engineering, mathematics, or a related field, or 7+ years of industry experience.
· Demonstrated strength in data modeling, ETL development, and data warehousing.
· Data warehousing experience with Oracle, Redshift, Teradata, etc.
· Experience with big data technologies (Hadoop, Hive, HBase, Pig, Spark, etc.).

Preferred qualifications:
· Industry experience as a data engineer or related specialty (e.g., software engineer, business intelligence engineer, data scientist) with a track record of manipulating, processing, and extracting value from large datasets.
· Experience building and operating highly available distributed systems for data extraction, ingestion, and processing of large data sets.
· Experience building data products incrementally and integrating and managing datasets from multiple sources.
· Query performance tuning skills using Unix profiling tools and SQL.
· Experience leading large-scale data warehousing and analytics projects, including using AWS technologies: Redshift, S3, EC2, Data Pipeline, and other big data technologies.
· Experience providing technical leadership and mentoring other engineers on best practices in the data engineering space.
· Linux/Unix skills, including using them to process large data sets.
· Experience with AWS.
- Some experience leveraging SAS, R, or MATLAB to manipulate data and set up automated processes per business requirements.
- Strong ability to interact, communicate, present, and influence at multiple levels of the organization.
- Master's degree.
- Excellent communication skills, to be able to work with business owners to develop and define key business questions and to build data sets that answer those questions.

Catapult provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, sex, national origin, age, disability, or genetics. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, leaves of absence, compensation, and training.

New Relic, Senior Data Engineer, San Francisco, CA. The mission of the growth automation team is to provide deep analytics on customer engagement and maturity to our internal teams, so that we can automatically deliver contextual, relevant, data-driven experiences based on the lifecycle stage of the customer. Our team is tasked with building a data platform on top of the incredible amount of business, user, and customer data that New Relic possesses, as well as delivering valuable insights on top of that platform to the teams that need it. We are looking for a talented senior data engineer to join us in building a scalable data ingest, processing, and analysis infrastructure to enable our coworkers and ourselves to analyze vast quantities of customer usage data collected and stored across a wide variety of data storage tools.
You will contribute to the vision for data infrastructure and business intelligence tools, and work with engineers and data analysts to establish best practices for table schemas and data storage. You should also possess deep technical skills, be comfortable working in and contributing to a nascent data infrastructure, and be excited about building a strong data foundation for the company. We are looking for someone who will be scrappy, independent, and excited about having a big impact at New Relic. The products you will contribute to building will be used by your coworkers throughout the company and will help make their lives, and the lives of the customers they work with, easier and more enjoyable every day. Some of the problems we work on involve determining the success and revenue impact of new product features, measuring and predicting the health of our customers, discovering patterns in product usage that we can use to improve our products, and helping sales and technical support reps communicate more effectively with their customers. To expedite your application process, please include a short cover letter telling us why you're interested in this position and why you think you'd be a great fit.

As a good fit for this role, you…
- Have at least 4 years of experience working as a software engineer in object-oriented programming languages in a production environment (Python or Java preferred), using cloud infrastructure.
- Have experience building data and dimensional models, infrastructure, and ETL pipelines for reporting, analytics, and data science, using modern workflow management tools (e.g., Airflow and other similar tools).
- Are proficient in SQL and in tuning relational databases for query performance.
- Can comfortably evaluate the pros and cons of different technologies and make tool and framework selections based on the needs of our customers and the technical requirements of the product.
- Work in short iterations to learn, refine quickly, and minimize risk introduced by new changes, secured by an ironclad test suite.
- Actively participate in code reviews and sprint planning meetings to ensure we are all building at our best while staying aligned with business goals.
- Have strong opinions, weakly held and rooted in personal experience; you can easily disregard previously held biases and opinions in the face of new information and use cases.
- Enjoy mentoring others and feel passionate about improving software and data by constant iteration.
- Are a strong written and verbal communicator who expects the best of yourself and others and would rather band together for a common cause than fly solo.
- Delight in building great tools that are a joy to use.
- Bonus: you have existing expertise in distributed columnar data stores (e.g., Redshift, BigQuery).

Responsibilities:
- Build, monitor, and maintain analytics and production data ETL pipelines.
- Provide the foundation for a data-driven culture by empowering other engineers and the product, sales, marketing, and finance teams to ask questions of the dataset in an easy, reliable way.
- Develop expertise in the data and own data quality for the pipelines you build.
- Enable data scientists and analysts to implement NLP and ML algorithms at scale in fault-tolerant, highly available systems.
- Iteratively design and develop new systems and tools to enable team members to consume and understand data faster and more accurately.
- Work closely with data scientists and analysts across the organization to ensure the self-service tools you build will address their needs and maximize positive impact on their work.
- Integrate yourself with other engineers across New Relic to learn from others and to ensure you stay up to date on the company's engineering best practices.
- Learn and improve your skills to continuously push us to deliver higher-quality data products and improve how we view the delivery of data company-wide.

A little about us: New Relic is a leading digital intelligence company, delivering full-stack visibility and analytics with more than 14,000 paid business accounts. The New Relic digital intelligence platform provides actionable insights to drive digital business results. Companies of all sizes trust New Relic to monitor application and infrastructure performance, so they can quickly resolve issues and improve digital customer experiences. Learn more at newrelic.com. New Relic is a San Francisco Best Places to Work award winner, an Oregon "Top Workplace" award winner, named a leader in Gartner's 2012, 2013, 2014, 2015, and 2016 "Magic Quadrant" for APM companies, a Top 100 OnDemand company, Best of SaaS (ThinkStrategies), Top 100 Coolest Cloud Computing (CRN), and one of 10 cloud management companies to watch (NetworkWorld); the list of accolades goes on. More important than all of that: we provide challenging work, opportunities to learn, high-quality teammates, a standard-setting product, and a company on the move. Our office is in the tech-rich urban center of San Francisco, with easy commute access and a plethora of good eats. We provide competitive compensation, equity, and big-company benefits (medical, dental, etc.), all while maintaining the energy, agility, and fun of a start-up. We can help with relocation and are open to H1-B transfers. New Relic is most decidedly an equal opportunity employer. We eagerly seek applicants of diverse backgrounds and hire without regard to race, color, gender identity, religion, national origin, ancestry, citizenship, physical abilities, age, sexual orientation, veteran status, or any other characteristic protected by law. Note: our stewardship of the data of many thousands of customers means that a criminal background check is required to join New Relic. We will nonetheless consider qualified applicants with arrest and conviction records in accord with applicable law, including the San Francisco Fair Chance Ordinance.

Position: Senior Big Data Engineer, Cloud Solutions (Sr. Data Engineer)
Location: Plano, TX
Duration: full-time, permanent position
Salary: $135k plus benefits
Senior big data engineer, cloud solutions. We will consider someone with Hadoop who is missing one of the other skills (Spark, Storm, or Kafka) if they are strong in the other technologies. They need to have good coding ability in Java, Scala, or Python. Cloud experience in AWS, Azure, or Google Cloud is a must-have. This is a permanent, direct-hire position working for our big data team.
Job description: The data services platform enables communication to the cloud and powers our engineering and data science teams to build efficient streaming solutions, smart analytic products, and dashboards. We are looking for a senior data engineer on this team to make architectural decisions and contribute to the design and development of data services for the customers of this platform.
What you'll do:
- Make architectural decisions to build efficient, adaptable, and scalable data pipelines in a public cloud environment to process unstructured big data.
- Work closely with product owners and other team members to deliver next-generation connected car data services.
- Rapidly architect, design, prototype, and implement architectures to tackle big data needs.
- Research, experiment with, and utilize leading big data technologies such as Spark, Kafka, and Kinesis on Microsoft Azure, AWS, etc.
- Operate in a highly iterative, agile development environment.
- Mentor and train junior team members on current technologies and software engineering best practices.

Who you are:
- You have 7+ years of experience with multiple programming languages and technologies, with at least 3 years in the big data space.
- You are fluent in several programming languages such as Python, Scala, or Java, with the ability to pick up new languages and technologies quickly.
- You have experience with large-scale big data technologies such as MapReduce, Hadoop, Spark, Hive, Impala, or Storm.
- You have experience with public cloud technologies such as Azure, AWS, or Google Cloud.
- You have a strong understanding of distributed systems principles, including load balancing, networks, scaling, in-memory vs. disk, etc.
- You have the ability to work efficiently in a Unix/Linux environment, along with source code management systems like Git and SVN.

Big data. Palo Alto Networks is the next-generation security company, leading a new era in cybersecurity by safely enabling applications and preventing cyber breaches for tens of thousands of organizations worldwide. Built with an innovative approach and highly differentiated cyberthreat prevention capabilities, our game-changing security platform delivers security far superior to legacy or point products, safely enables daily business operations, and protects an organization's most valuable assets. If you are a motivated, intelligent, creative, and hardworking individual who wants to contribute and make a difference, this job is for you!

Responsibilities: responsible for designing, developing, and sustaining Palo Alto Networks' new-generation, machine-learning-based big data service system to support our customers. It is a cross-functional role, working closely with malware research, data science, quality assurance, and technical support teams on designing, developing, debugging, troubleshooting, and resolving issues in the software system. This includes, but is not limited to:
- Designing, developing, and maintaining new-generation, machine-learning-based big data systems for web page categorization, data/IP mining, and malicious site detection.
- Creating and enhancing tools to analyze and process large quantities of data sets.
- Utilizing your programming skills for efficient and robust implementation.
- Working closely with malware research and data science teams to enhance the malicious site detection and machine-learning/data-mining-based big data system.

Requirements:
- 2+ years of object-oriented programming experience in a Linux environment focused on server-side development; Java preferred.
- Experience with SQL and non-SQL databases; Mongo preferred.
- Experience with big data technologies such as Hadoop, Spark, Kafka, and HBase.
- Experience with AWS products and services.
- Strong analytical and communication skills.
- Self-motivated and able to work in a dynamic environment.
- Team player with a can-do attitude who takes initiative and extra responsibility.
Education: graduate degree (MS, PhD) in computer science or engineering (preferred), or equivalent experience with a BS degree. Learn more about Palo Alto Networks and check out our fast facts.

Company overview: LigaData is a Silicon Valley startup that builds enterprise-grade continuous decisioning solutions. We use proven open-source technology that is designed to solve the problems of the world's largest and most complex organizations at high speed.
We leverage data science to solve real business challenges. Our global team is made up of data scientists, big data engineers, architects, developers, and leading experts who collectively hold over 60 patents. We are focused on providing real value throughout the entire data supply chain, from back-end infrastructure to state-of-the-art analytics, for the financial services, telecommunications, and healthcare industries. We are a fast-paced, exciting, and innovative company where you will have the opportunity to own your terrain, influence the company's success, and see your work directly impact our clients. The LigaData team is bringing the power of our platform and distributed storage to companies of all sizes. The heart of what we do is Kamanja, our continuous decisioning project. We are fully dedicated to the open-source movement, which has allowed us to create the only offering on the market that is totally free of vendor lock-in. We are a successful start-up, and our dynamic work environment encourages and rewards innovators who bring outside-the-box thinking and leadership skills. Do you have the entrepreneurial vision and ambition to be a part of our journey?

Position overview: reporting to the vice president of professional services, the project manager will play a strategic and hands-on role. The focus of the project is developing, shepherding, and executing the big data strategy for an international telecommunications client. You must be a seasoned consultant and project manager who knows how to get work done, develop relationships, and extract value. You must have strong knowledge of, and experience with, planning and running large-scale technical projects. This position will require international travel. Candidates can be based in Miami, Atlanta, or Kingston, Jamaica.

Detailed responsibilities:
- Execute on the strategic data strategy roadmap.
- Translate the roadmap into an actionable, achievable project plan.
- Translate the business domain roadmap into a technology roadmap that is aligned with the broader enterprise data architecture roadmap.
- Be the subject matter expert for the technology used in the domain, to address questions related to technology integration or migration and infrastructure frameworks.
- Work with internal and customer resources to understand emerging technologies and technical roadmaps, and how they apply to the customer environment.
- Partner with internal and client resources to provide technical direction during project initiation.
- Guide the engagement and Hadoop engineering team on the implementation of architecture-related strategies, concepts, and tools.
- Partner with the technical project management team to identify the technical skills and training required to support new technologies.
- Develop solution alternatives to meet business needs, considering the costs and risks of any trade-off decision.

Requirements:
- Tenacity, energy, and commitment to succeeding.
- Ability to work through red tape while keeping the team's focus on objectives at all times.
- Consulting experience (9+ years).
- Experience managing technical delivery of medium to large data projects (roughly $5M–$15M).
- Experience in the telecommunications industry.
- Excellent verbal and written communication skills, including being able to present the work being done to developers, project managers, and senior leaders.
- Ability to manage projects to meet deliverables on time and on budget.
- Superb project management skills.
- Ability to build and work with a large virtual team across various groups.
- Strong knowledge of Hadoop, big data, and other new and upcoming technologies.
- Ability to shape and guide the future of the data warehouse.
- 80%–100% travel.
Job description: that's a cool job, I want it! Ready to shake things up? Join us as we pursue our disruptive new vision to make machine data accessible, usable, and valuable to everyone. We are a company filled with people who are passionate about our product and strive to deliver the best experience for our customers. At Splunk we're committed to our work, our customers, having fun, and most meaningfully to each other's success. We continue to be on a tear while enjoying incredible growth year over year. Learn more about Splunk careers and how you can become a part of our journey. With growth comes scaling opportunities and the need to step up our enterprise data game. Our goals are high-reaching, and to help us achieve them Splunk is looking for an experienced data architect with in-depth, hands-on knowledge of foundational data architectures such as data warehouses, ETLs, and in-memory OLAP models, as well as experience in NoSQL and cloud DW implementations.

Responsibilities (I want to and can do that):
* Responsible for developing and supporting a data model and architecture that supports and enables the overall strategy of expanded data programs, services, process optimization, and advanced business intelligence.
* Partner with business domain experts, system analysts, data application architects, and development teams to ensure data design is aligned with business strategy and direction.
* Identify and document standard methodologies, standards, and architecture guidelines.
* Establish and lead the socialization of data governance policies within the team.
* Specify the overall data architecture for all areas and domains, including data acquisition, data integration, operational data stores, master data management, data warehouse, data provisioning, ETL, and BI.
* Dive deep as required to assist data engineers and business intelligence engineers through technical hurdles impacting delivery.
* Contribute to building standards around data governance, data security and privacy, data quality, and speed of analysis.

Requirements (I've already done that or have that):
* B.S. degree in computer science, mathematics, statistics, or a similar quantitative field.
* 15+ years of validated hands-on work in data architecture, data modeling (relational, dimensional, columnar), and metadata management.
* 7+ years of experience with data modeling tools such as Embarcadero ERwin or ER/Studio.
* Solid experience with commercial ETL platforms, with in-depth knowledge and understanding of ETL methodology and design supporting data transformations and the transformation layer.
* Experience with in-memory, NewSQL, and NoSQL databases.
* Successful track record of cloud data warehouse implementations such as Redshift or Snowflake.
* Familiarity with Spark or Hadoop a plus.
* Expert-level skill in modeling, managing, scaling, and performance tuning of high-volume OLTP, OLAP, and data warehouse environments.
* Advanced knowledge of data management systems, practices, and standards.
* Proficiency in a major programming language (e.g., Java) and/or a scripting language (Perl, Python).
* An innate desire to deliver and a solid sense of collaboration.

It's great if you also have:
* Strong knowledge of and experience with agile/Scrum methodology and iterative practices in a service delivery lifecycle.
* Experience with machine learning, data science, statistical techniques, text mining, natural language processing, computational linguistics, and information retrieval.
work" by the san francisco business times ten years in a row we offer a highly competitive compensation package and a plethora of benefits and we take pride in the diverse and innovative culture that sets us apart as a company splunk is an equal opportunity workplace and is an affirmative action employer we are committed to equal employment opportunity regardless of race color ancestry religion sex national origin sexual orientation age citizenship marital status disability gender identity veteran or any other status prohibited by applicable national federal state or local law we also consider applicants regardless of criminal histories consistent with legal requirements *about splunk*splunk was founded to pursue a disruptive new vision: make machine data accessible usable and valuable to everyone machine data is one of the fastest growing and most complex areas of big data?generated by every component of it infrastructures applications mobile phone location data website clickstreams social data sensors rfid and much more splunk is focused specifically on the challenges and opportunity of taking massive amounts of machine data and providing powerful insights from that data it insights security insights business insights it?s what we call operational intelligence since shipping its software in 2006 splunk now has over 13 000 customers in more than 110 countries around the world these organizations are using splunk to harness the power of their machine data to deepen business and customer understanding mitigate cybersecurity risk prevent fraud improve service performance and reduce costs innovation is in our dna ? from technology to the way we do business splunk is the platform for operational intelligence!splunk has more than 2 700 global employees with headquarters in san francisco an office in san jose ca and regional headquarters in london and hong kong we?ve built a phenomenal foundation for success with a proven leadership team highly passionate employees and unique patented software we invite you to help us continue our drive to define a new industry and become part of an innovative and disruptive software company *benefits & perks: wow! this is really cool!**sf only*medical full company paid dental vision and life insurance flexible spending and dependent care accounts commuter accounts employee stock purchase plan (espp) 401(k) 3 weeks of pto sick leave stocked micro kitchens in splunk offices catered lunches on mondays catered breakfast on fridays basketball hoops ping pong arcade games bbq?s soccer ?fun fridays? pursuant to the san francisco fair chance ordinance we will consider for employment qualified applicants with arrest and conviction records *non sf*medical full company paid dental vision and life insurance flexible spending and dependent care accounts commuter accounts employee stock purchase plan (espp) 401(k) 3 weeks of pto and sick leave our work environments vary by location however we believe in hosting amenities and fun activities to fuel our energy you may find fully stocked micro kitchens catered lunches on mondays and breakfast on fridays basketball hoops ping pong arcade games bbq?s soccer and ?fun fridays? this isn?t a job ? it?s a life changer ? 
Are you ready? Individuals seeking employment at Splunk are considered without regard to race, religion, color, national origin, ancestry, sex, gender, gender identity, gender expression, sexual orientation, marital status, age, physical or mental disability, or medical condition (except where physical fitness is a valid occupational qualification), genetic information, veteran status, or any other consideration made unlawful by federal, state, or local laws. Please review the US Department of Labor's EEO is the Law notice, Splunk's affirmative action policy statement, and Splunk's pay transparency nondiscrimination provision. Splunk does not discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay or the pay of another employee or applicant. Splunk is also committed to providing access to all individuals who are seeking information from our website. Any individual using assistive technology (such as a screen reader or Braille reader) who experiences difficulty accessing information on any part of Splunk's website should send comments to accessiblecareers@splunk.com; please include the nature of the accessibility problem and your e-mail or contact address, and if the problem involves a particular page, the URL of that page. Splunk doesn't accept unsolicited agency resumes and won't pay fees to any third-party agency or firm that doesn't have a signed agreement with Splunk.

Date: Mar 20, 2018
Location: New Orleans, LA, US
Company: Entergy
About Entergy: Entergy Corporation is an integrated energy company engaged primarily in electric power production and retail distribution operations. Entergy owns and operates power plants with approximately 30,000 megawatts of electric generating capacity, including nearly 9,000 megawatts of nuclear power. Entergy delivers electricity to 2.9 million utility customers in Arkansas, Louisiana, Mississippi, and Texas. Entergy has annual revenues of approximately $11 billion and more than 13,000 employees. Learn more about our corporate, utility, nuclear, power generation, and gas and transmission businesses here: http://www.entergynewsroom.com/about-us.
Primary location: Louisiana, New Orleans. Job function: information technology. FLSA status: professional. Relocation option: approved in accordance with the Entergy guidelines. Union description code: non-bargaining unit (NBU). Number of openings: 1. Req ID: 77942. Travel percentage: up to 25%.

Job summary/purpose: This IT position is a key role on the enterprise analytics team supporting the enterprise data lake ecosystem. The data architect will design, implement, and support the enterprise data lake and the development of information management solutions in support of the enterprise analytics roadmap. This position will work closely with business units throughout the organization to design and implement data strategies that support the democratization, integration, and standardization of data at an enterprise level, ensuring consistency of business definitions and data quality. This position requires strong technical and communication skills, as well as proven experience in data management, information management, and the insurance industry in general. This team member will be principally focused on the identification and acquisition of authoritative enterprise data and the physical mapping and data migration into the enterprise data lake in support of enterprise analytics projects and deliverables.
Establish and manage the enterprise data lake data architecture across multiple data types (structured, semi-structured, and unstructured), balancing the need for access against security and performance requirements. This individual focuses primarily on enterprise data requirements: design, access, and usage. Also included is the development or use of process models, the creation or use of information or data models, interface designs, and the development of internal and external checks and controls to ensure proper governance, security, and quality of information assets. The primary job function of the data architect is to provide primary, advanced data knowledge and data support to the enterprise analytics team's data wrangler role, to identify, acquire, and host enterprise data within the enterprise data lake; in other words, the key person to help fill the enterprise data lake with data, as well as to help find data within it. The data architect works closely with the data modeler to design and build the data architecture of the enterprise data lake; with the ETL team to help map and load data into the enterprise data lake; with the data governance team to help manage and govern the data within the enterprise data lake; and with the legacy application systems teams to identify, capture, collect, map, and load authoritative enterprise operational data into the enterprise data lake. This position requires strong technical and communication skills, as well as proven experience in data management, information management, big data strategy and planning, information modeling and delivery, agile implementation, business collaboration, program and project management, and the utility industry in general. This role will report directly to the IT Service Pod Manager, Enterprise Analytics.
There is also a matrixed relationship to the enterprise analytics team. This role will directly interact with other roles such as solution architect, data modeler, data wrangler, and data scientist.
+ Definition, refinement, ownership, and representation of the enterprise data lake reference architecture, including:
+ Data supply and integration architecture, tools, and platforms.
+ Analytics delivery architecture, tools, and platforms.
+ Representation and alignment of the enterprise data lake reference architecture to enterprise and local analytics architecture teams.
+ Manage and coordinate new information demand that impacts the enterprise data lake reference architecture.
+ Capture business capability requirements, functional requirements, and expected service levels from business units.
+ Establish design guidelines for software and hardware integration, performance, reliability, operating, and security designs; support business units in the creation and implementation of project use cases.
+ Maintain an understanding of technology needs and solutions for business units; liaise between the business units, vendors, project management, and technical teams on questions, issues, and revisions to the solution architecture.

Job duties/responsibilities:
General: Working under the direction of the IT service pod manager for enterprise analytics, translate project goals into usable data architectures to guide project solution development and achieve consistency of information assets across the entire application portfolio. Simply stated: responsible for the overall data architecture of the enterprise data lake ecosystem across multiple content types (structured, semi-structured, and unstructured data), inclusive of data management, information management, and analytics solutions. Participates in the development of the enterprise analytics solution strategy and the identification and design of IT architectures to support emerging business strategic intent (e.g., big data management and analytics).

Architecture: Designs and implements the enterprise data lake data architecture, working closely with the IT enterprise architecture team plus other data lake and analytics teams. Translate the enterprise data lake reference architecture into an operational ecosystem. Develop, manage, and deploy best-in-practice data collection, data governance, storage, and manipulation techniques (extract, transform, and load) that ultimately enable business units to unlock additional value from company-held data. Responsible for the overall design and build of the enterprise data lake domain across multiple content types (structured, semi-structured, and unstructured data), inclusive of data management, information management, and analytics solutions.

Data management technologies: Responsibilities also include the creation or use of enterprise data management processes, models, and technologies; data interface designs; and the development of internal and external checks and controls to ensure proper governance and quality of data assets, inclusive of enterprise methods and standards. As needed, lead or participate in POC, investigative, and research projects.

Roadmap: Participate in data strategy and roadmap exercises, data architecture definition, and business intelligence / data warehouse product selection, design, and implementation.

SDLC: Work through all stages of a data solution life cycle: analyze and profile data; create conceptual, logical, and physical data model designs; architect and design ETL, reporting, and analytics solutions.
solutions**consulting:** provide primary advisory and consulting services for technical aspects of enterprise analytics solutions and applications (entergy?s subject matter expert on the enterprise data lake technology stack and associated data architectures) be a thought-leader at entergy for solving data quality availability issues and work with the data scientists and citizen data scientists throughout entergy various business units participates in the development of enterprise analytics solution strategy **minimum requirements**minimum **education** required of the positionbachelor's degree in related field such as business engineering or it (or equivalent work experience) mba or graduate degree in it or engineering or relevant discipline preferred minimum **experience** required of the positionminimum of 8+ years in information technology experience required experience in large scale or enterprise projects including data architecture and or analytics leadership experience (e g data architect analytics architect solution architect level experience) experience with big data electric utility and customer systems preferred minimum **knowledge skills and abilities** required of the position+ 8+ years of hands-on experience in architecture design or development of enterprise data solutions applications and integrations+ hands-on experience with enterprise data architectures and data toolsets such as: data lakes data warehouses data marts data models modeling tools data quality and profiling tools data management tools etl tool job category: sr netezza etl developerposition title: it consultant 5location: washington dc description: we are looking for a netezza developer to be part of a project team that is creating a data platform that will be used by data scientists for data discovery and statistical model creation the platform will provide a new mechanism to ingest store organize and analyze data across many different data sources and formats the results will be leveraged to support data driven decision making and process improvements the project includes enhancing a netezza based environment developing routines to populate data into the environment creating a search utility to find content within the environment and developing visualization analysis solutions clearance level: position of trust suitability requirement: please be aware that this position requires a u s government public trust suitability determination applicants who accept an offer of employment will be subject to government security investigation(s) and must receive a favorable suitability determination for access to u s government information for continued employment the government investigation for this position includes a credit check job responsibilities: · automate maintain monitor and enhance etl (extract transform and load) data processes using approved sec information technology and tool · update etl processes efficiently to support minor dataset format changes by data vendors collaborate with data vendors to support format changes · promote or facilitate promoting etl scripts code artifacts to various environments including staging test and production · provide technical recommendations to improve efficiency of data management etl process · create proactive alerts notifications to identify when data load anomalies occur · support the data branch with prescribed data access requests · perform database and datasets refreshes across systems and environments · perform database tuning including creating reviewing os 
stored procedures indexes partitioning tables etc · create support updates to materialized views to support users requests and inquiries · support on-loading off-loading data to from hadoop environment · create maintain database objects such as tables views indexes constraints and sql (structured query language) code and stored procedures and shell scripts · create maintain data archival process where applicable · ensure data integrity within each data set and support linkage to other appropriate data sets in a manner consistent with approved sec information technology standards and procedures this includes testing and validating any new database design against previous data structureseducation requirements: bachelor’s degreecertification requirements: n a experience skills required: · at least 8+ years of it experience· at least 3 years of working with netezza database· at least 3 years of data stage and etl experience· experience working with unix scripting· experience working with nzsql · experience working with xml data formats · bachelor's degree· excellent communications skillstravel required: n aphysical requirements: n adesired qualifications: · netezza certification· sas experienceif you feel you are qualified for this position please go to http: www salientcrgt com careers to apply salient crgt (salient) is a leading provider of information technology engineering and intelligence analysis services to agencies in the intelligence defense homeland security and cyber domains salient is proud to be an equal employment opportunity aap employer and maintains a drug-free workplace salient prohibits discrimination against employees and qualified applicants for employment on the basis of race color religion sex (including pregnancy) age disability marital status national origin veteran status or any other classification protected by applicable discrimination laws salient also participates in e-verify click here to learn about the e-verify program for more information on salient crgt please visit us at www salientcrgt com playstudios inc is a developer of engaging casual games for the world’s largest social and mobile platforms founded by a team of experienced gaming and technology entrepreneurs playstudios’ first free-to-play application myvegas combines the best elements of popular social games with established gambling mechanics players enjoy an ever-growing collection of slot and table games and the opportunity to earn an unprecedented selection of valuable real-world rewards in creating myvegas playstudios has partnered with mgm resorts international and its portfolio of the most recognized resorts in the industry including bellagio aria vdara mgm grand mandalay bay the mirage monte carlo new york-new york luxor excalibur and circus circus we’re looking for a data engineer with 2-3 years of experience to manage the pipeline of data from our products and players and into the hands of the key stakeholders and decision makers in playstudios as a data engineer you’ll be responsible for enhancing and maintaining petabyte scale data warehouse and tools around it you will do this by partnering with various cross functional teams like product management data scientists and also fellow engineering teams this person will be responsible for streamlining the multiple data tools and sources used by our team and for ensuring that data quality and consistency is maintained throughout this is an exciting opportunity to join a world-class team and bring our exciting games to even more fans worldwide 
our ideal candidate is bright responsible self-motivated and gets stuff done we look for problem solvers who can intuitively anticipate problems; look beyond immediate issues; and take initiative to improve both our software and our development infrastructure in short we look for people who take pride in the craft of data engineering and have proven to be great at it responsibilitiesdesign build and launch new data models and etl pipelines that will make playstudios more data drivenwork with engineering data science and product management teams to build and manage a wide variety of data setssimplify and democratize access to useable data throughout the companyability to work on multiple areas like data pipeline etl report development with tableau data modeling & design writing complex sql queries etc design and publish custom dashboards for product teams and stakeholders around the companycreate ad-hoc queries and reports for product user acquisition bi & sr managementautomate and document processesrequiredbs degree in computer science or engineering discipline with at least 2+ years work experienceproblem solver with excellent communication skills with ability to make sound complex decisions in a fast-paced technical environmentexpert knowledge in database technologies which means excellent in sql and a good understanding of trade-offs in building data modelspractical programming experience in at least one of the language like python java clojure etc strong experience in working with large data setsexperience building etl with open source tools such as talend pentahovery good experience in unix linuxcapable of planning and executing on both short-term and long-term goals individually and with the teampreferredexperience with aws tools & technologies (s3 emr kinesis etc)passionate on various technologies including but not limited to sql no sql mpp graph databases etc experience with streaming data pipelines using any of kafka aws kinesis spark streaming etc knowledge of statistics and or machine learning familiarity with columnar data storesexperience in any web framework eg: django to develop our custom bi dashboardsbenefits and perks100% health benefits coverage for you and your dependents!open creative office space with inviting outdoor patiosthree fully-stocked kitchens and large communal dining areascatered lunches throughout the weekemployee-driven entertainment happy hours and team-building events throughout the yearping pong table and game roomcommuter benefits programcell phone plan discountsflexible vacation policylocation location location! right across from caltrain and vibrant downtown burlingamefree parking and lots of ita great place to work! 
come by to find out for yourself!this is an exciting opportunity to join a world-class team and bring our exciting games to even more fans worldwide data architect global data & content who we arefounded and continuously led by inventor and entrepreneur tony aquila solera is a global leader in digital technologies that connect and secure life's most important assets: our cars homes and identities since its inception in 2005 as a garage-based startup solera has grown aggressively with over 50 acquisitions across its platforms the company's current product solutions include audatex autodata autopoint cap hpi colimbra digidentity enservio explore data hollander identifix inpart lynx and titletec as well as the company's flagship digital garage application today solera processes over 300 million transactions annually for approximately 235 000 partners and customers in over 80 countries unified by a strong culture that values uncommon entrepreneurial thinking and continuous "do-it-different" innovation solera's global workforce of 6 700+ associates come from diverse forward-thinking industries that include automotive technology artificial intelligence software development data sciences cybersecurity cognitive design and digital identity protection for more information please visit solera com this position (dallas tx) is with solera global data & content team headquartered in dallas tx and madrid spain our team is responsible for making sense of data and providing insights to various business groups within the company we use foundational open source technologies to move solera forward for more information go to: soleragdc com http: www solera com careers what you’ll be doingdesign prototype and integrate the big data architecture for batch and streaming use cases based on the hortonworks hadoop distributionleading the technical team by example handicrafting new architectural elements to investigate potential implementationsdesign and optimize data structure sourcing cleansing and normalization blending the best technologies with the business context and needsare you qualified? 
must have:proficient in linux unix environmentthe position requires at least 5 years of experience in software development2 years of experience with the hadoop ecosystemexperience with big data technologies such as hadoop stack kafka sparkdeep knowledge of java or scalaextensive knowledge of design patterns and solid principlesstrong data modelling skillsexperience with relational databases and knowledge of nosqlcontinues integration delivery deployment tools and methodologies: maven git nexus bamboo jenkins sonar strong with agile development concept and experience working in agile scrum environment highly desirable:experience with stream processingexperience with hortonworks distributionexperience with solr or elastic searchdegree in computer science or other numerical discipline benefits:relocation available for qualified candidateshighly competitive pay and health & wellness plans401ktuition reimbursementno b s policy that promotes transparency and accountabilitybeautiful and uncommon workspaces to collaborate and unwindfree gym membership (to the awesome gym that’s right next to our office)free meals healthy snacks (like nuts and yogurt parfaits) some indulgent snacks (like baked chips and dark chocolates) and refrigerators full of juices teas and other life-essential beverages (including red bull)the latest and greatest in all things technologylots and lots of awesome cars the solera waysolera’s uncommon culture is based on three simple principles: think 80 20 (focus) act 30 30 (efficiency) and live 90 10 (accountability) we define our mindset using our 3h’s: humility a hunger to succeed and a desire to hunt for opportunities to win we train our volunteers to engage with each other modulating between their intellect (iq) and emotional intelligence (eq) using our 3fs: facts finesse and force solera has become a global technology leader that is constantly growing in the double digits the principles drills and values associated with the solera way have been fundamental to solera’s success and our ability to grow continuously change and innovate are you uncommon?we’re on the hunt for an experienced big data developer who ranks in the top quartile among their peers someone who has a highly competitive and entrepreneurial mindset that is wired with a team-first attitude has no problem rolling up their sleeves to execute their missions and can modulate between leading and following as needed you will serve as the big data developer the role is based at solera offices in westlake (dallas fort worth area) this role exists within a team that develops global solutions at crestron electronics inc we build the technology that integrates technology we are proud to be the largest and most recognized brand in automation and control solutions and the premier technology partner for fortune 500 businesses globally our products’ are integrated into new high-tech commercial buildings’ to include some of the most exciting real estate throughout the world our clients include google microsoft amazon linked in and many others we are the leaders in the most exciting and opportunistic industry in the world!our automation and control solutions for homes and buildings allow our clients to control entire environments with the push of a button integrating systems such as audio visual lighting shading security building management systems and hvac to provide greater comfort convenience and security we continue to experience rapid growth as we invest in resources and create new opportunities; as a result we have 
exciting opportunities for a big data software engineer to join our enterprise team in rockleigh new jersey this individual will be responsible for building big data platforms infrastructure and developing big data applications in this role you will also help us to build and enhance our cloud platform with new features for our customers you will work on multi discipline projects you must be creative and thrive on solving problems *li-mt1analyze product requirements to determine feasibility of design within time and cost constraints work with discipline leads and other engineers to create module unit and interface specification implementation integration and testing of product rapidly architect design prototype and implement architectures to tackle the big data and data science needsresearch experiment and utilize leading big data methodologies such as hd insight hadoop spark azure data lake powerbi azure data factory redshift and microsoft azure paasarchitect implement and test data processing pipelines and data mining data science algorithms on a variety of hosted settings such as azure client technology stacks and crestron’s own clusters work as part of an agile team special projects as assignedbachelor’s degree is required area of study such as: computer science business information systems or other relevant field a minimum of 5 years of experience in architecting and building enterprise scale systems is required a minimum of 2 years of experience in c# net with experience with source code management systems like svn is required fluency with either agile or scrum methodologies is requiredfluency in several programming languages such as python scala or java with the ability to pick up new languages and technologies quickly experience with large-scale big data methods such as mapreduce hadoop spark hive impala or storm is requiredexperience in cloud and distributed systems principles including load balancing networks scaling in-memory vs disk etc is required ability to work efficiently under a windows or unix linux environment is required experience in azure data lake powerbi azure data factory is preferred exceptional communication skills and the ability to work collaboratively in a fast paced team environment is essential eoe m f d vbenefitsat crestron electronics we offer a competitive total compensation package including medical dental vision life insurance and short term disability 401k with company contribution paid vacation holidays and more! 
we have new onsite state of the art fitness and wellness centers at our headquarters in rockleigh nj must be able to work in the us without sponsorship *no solicitation* any agency submittal to any and all employees of crestron electronics inc by any method of communication will be deemed the sole property of crestron electronics inc sr big data engineerhelp build a data pipeline and platform for our customer facing products work closely with data scientists to implement descriptive and predictive analytics explore new data technologies and advise the department on best practices daily execution excellencework closely with developers data scientists and product managers to understand the questions that are being asked and how to answer thembuild highly available scalable and fault tolerant systems for batch and real-time data analysisexplore new technologies and how they might enhance our data solutionsone year and beyondbuild better tools platforms to help developers take advantage of our databuild apis to expose data and machine learning models for consumptionexpertiseif you can execute the work you can do the job that being said we realize we likely need someone with…strong programming abilityexperience in building data lake in a cloud environment experience in building real time streaming data ingestion and processing pipeline using kafka or pub sub experience with data processing tools (e g hadoop spark dataflow etc )experience building etl elt pipelines familiar with airflow ideally you will possess…ability to program in python or javaexperience with google bigquerygood math skillsfamiliarity with machine learning conceptsexperience as a team lead or other technical leader - provided by dice hadoop spark dataflow eagle ray inc is looking for database engineers to provide technical expertise for database design development implementation information storage and retrieval data flow and analysis develop relational and or object-oriented databases database parser software and database loading software project long-range requirements for database administration and design the database engineers work primarily at the front end of the lifecycle-requirements through system acceptance testing and initial operational capability (ioc) responsible for developing a database structure that fits into the overall architecture of the system under development and has to make trades among data volumes number of users logical and physical distribution response times retention rules security and domain controls develops requirements from a project’s inception to its conclusion for a particular business and information technology (it) subject matter area (i e simple to complex systems) assist with recommendations for and analysis and evaluation of systems improvements optimization development and or maintenance efforts translates a set of requirements and data into a usable document by creating or recreating ad hoc queries scripts and macros; updates existing queries creates new ones to manipulate data into a master file; and builds complex systems using queries tables open database connectivity and database storage and retrieval using cloud methodologies required skills include:0-12 years of experience bachelor's degree in math science engineering computer science or related active top secret sensitive compartmented information (ts sci) security clearance required u s citizenship required apache hadoop postgressql mysql or vmware or oracle dbms knowledgepossess knowledge of sql server and its tools 
including the facets of successfully administering a wide range of simple to highly complex environments experience with data and schema design and engineeringdemonstrated practical experience with data migration from legacy systems to central repositoriesindustry standard exchange schema implementation experience (e g cybox or capec)be able to evaluate and install new software releases patches and system upgrades knowledge and understanding of all aspects of database tuning: software configuration memory usage data access data manipulation sql and physical storage experience supporting technology strategy roadmap experience with development and execution of database security policies procedures and auditing-experience with database authentication methods authorization methods and data encryption techniques possess good communication skills both oral and written must work well in a team environment as well as independently must exhibit good time management skills independent decision making capability; focus on customer service ability to work with the other technical members of the team to administer and support the overall database and applications environment desired skills include: experience database engineering support to dhs dod or intelligence customersdata scientist skills and experienceunderstanding of certification and accreditation (nist 800-53) processes as they apply to database technologiesoperating system and hardware platform knowledge preferred experience working with large unstructured data setsexperience with map reduce technologiesexperience with process development and deploymenttrained in six sigma methodologyitil knowledge and certificationdesired certifications:cloudera certified professional (ccp): data scientistccdh: cloudera certified developer for apache hadoopccah: cloudera certified administrator for apache hadoopccshb: cloudera certified specialist in apache hbasecsslp certified secure software lifecycle professionaldatabase specific certifications as appropriate to positiondodi 8570 1 compliance at iat level i certification highly desired equal opportunity employer m f disability vet sexual orientation gender identity + **primary location:** united states new jersey+ **education:** bachelor's degree+ **job function:** technology+ **schedule:** full-time+ **shift:** day job+ **employee status:** regular+ **travel time:** no+ **job id:** 18010564**description**about citiciti the leading global bank has approximately 200 million customer accounts and does business in more than 160 countries and jurisdictions citi provides consumers corporations governments and institutions with a broad range of financial products and services including consumer banking and credit corporate and investment banking securities brokerage transaction services and wealth management our core activities are safeguarding assets lending money making payments and accessing the capital markets on behalf of our clients citi?s mission and value proposition at http: www citigroup com citi about mission-and-value-proposition html explain what we do and citi leadership standards at http: www citigroup com citi about leadership-standards html explain how we do it our mission is to serve as a trusted partner to our clients by responsibly providing financial services that enable growth and economic progress we strive to earn and maintain our clients? 
and the public?s trust by constantly adhering to the highest ethical standards and making a positive impact on the communities we serve our leadership standards is a common set of skills and expected behaviors that illustrate how our employees should work every day to be successful and strengthens our ability to execute against our strategic priorities diversity is a key business imperative and a source of strength at citi we serve clients from every walk of life every background and every origin our goal is to have our workforce reflect this same diversity at all levels citi has made it a priority to foster a culture where the best people want to work where individuals are promoted based on merit where we value and demand respect for others and where opportunities to develop are widely available to all **job description:**markets data team is building the next generation data fabric to solve for business analytics and growing regulatory needs vast amounts of data assets have been accumulated through the years data fabric built on emerging technologies will facilitate the data being inspected cleansed transformed for support decision-makingthis job involves being part of a dynamic team and contributing towards software development of core components within the next generation big data platform the ideal candidate will have an eye for building and optimizing data systems and will work closely with our systems architects data scientists and analysts to help direct the flow of data within the pipeline and ensure consistency of data delivery and utilization across multiple projects**development value** :candidate has the opportunity to be a major contributor to the citi markets data strategy and contribute towards the goal of increasing revenue using key metrics for decision making the candidate will work with bright and innovative individuals both on the business and technology side and the successful candidate can make a significant difference to the business performance **key responsibilities** :+ work with business finance technology and compliance stakeholders to ensure their data needs are met and that data quality issues originating from business processes are remediated in support of risk finance and compliance objectives + the role will work with compliance business operations and technology stakeholders to identify critical data elements define data quality rules and implement data quality measurement and monitoring+ report on data governance initiatives to senior management and chief data office and other management committees+ develop and implement data controls and reconciliations as needed to ensure accuracy and completeness of data+ implement data quality issue management and resolution processes to ensure data quality issues are identified prioritized and remediated effectively+ serve as a subject matter expert on the individual data elements used by lines of business understanding data lineage data transformation reporting and ultimately how it feeds into business decisions and actions+ provide regular performance and status reports to the required stakeholders and contribute to status meetings+ collaborate with it bi teams on required data governance it projects to ensure requirements are met and completed in a timely manner + create a process to track and monitor critical data elements to identify data issues and to provide insights on identified issues as how they might impact the data and business analysis**qualifications****qualifications** :graduate or undergraduate 
degree in computer science information systems or equivalent**skills** :+ 8+ years' experience of experience in data driven systems and processes with knowledge in technology data database management+ 8+ years of software development work experience in java+ strong understanding of data architecture data quality and related technologies along with data quality controls reconciliations principles+ experience of data governance data quality processes management and measurement+ experience with data reconciliations and controls required by a data platform+ experience performing root cause analysis to address identified issues+ functional domain experience and exposure to data governance and data management+ experience with data warehouse; rdbms - netezza sybase oracle; sql stored procedures & java+ data quality business process improvement experience as it pertains to compliance and data quality initiatives+ experience with development in linux environment+ strong written verbal and interpersonal communication skills+ strong process viewpoint+ cooperative problem solving mindset able to work well across multiple functional areas+ good understanding of business strategy and it landscape+ proven ability to communicate business rationale to internal and external counterparts+ knowledge of data reporting and analytic functions within a financial services organization + demonstrated ability to deal with ambiguity and to interface with internal and external resources to define requirements+ demonstrated ability to succeed in a fast-paced rapidly changing business environment + experience and understanding equites trading and regulatory obligations for cash and derivatives products+ comfortable in presenting suggestions for change and or improving work flow full-time employee opportunity in alexandria va description:big data developer will contribute directly to application development and business teams as a member of cross-functional agile development teams working alongside your development team members you will support feature development reporting and data analysis you will also work closely with the architecture team to ensure current development and analytical activities align with long-term enterprise goals you will collaborate directly with development teams on a daily basis and provide developers with database guidance and hands on data solutions data solutions focus primarily on technologies such as elasticsearch hadoop hbase and other nosql technologies however additional platforms and technologies may be used as business needs change responsibilities:assist development teams in the build out of database technologiesensure use of best practices for data-driven applications within development teamsreview and optimize access strategies for application dataenforce database quality standards within the code basebuild out and maintain reporting service and integration service applicationssupport transactional and analytical database efforts in driven by business needsrespond to data inquiries from client stakeholders and the business teamsupport architecture team in long-term planning and implementationsupport after-hours and on-call duties as requiredsupport operations and assist with troubleshooting activities as neededcontinue to develop skills in database optimization reporting data analysis and other database duties as a member of an agile software teamqualifications:you can visualize the big picture analyze key issues and recommend pragmatic solutions you can create and maintain 
processes in line with requirements for project goals you have strong opinions on how things should be built yet are flexible in achieving solutions you work well in an agile team environment where process and schedules can change quickly you want to make a positive impact and not just be a cog in the machine you are a smart creative proactive problem solver b s in computer science management information systems or related field5+ years’ experience as a big data nosql developerrecent experience working with elasticsearch hadoop hbase and other nosql technologiesdue to this assignment being a government contract candidates must be us citizens sensor technology the internet of things and big data analytics are some of the hottest areas in tech right now at savi technology we offer you the ability to work on all of these with three critical differentiators first you get to work with a fantastic tech stack second you get to use this technology to solve very interesting real-world problems third you get to see some of the largest companies in the world use your solutions everyday—on billions of dollars of “things”—to realize millions of dollars of value savi technology is hiring experienced engineers excited by the prospect of using streaming analytics and machine learning to enable our customers to change their entire operations of their enterprises using insights and predictions from sensor and iot data we are looking for engineers with experience in analytics and machine learning as well as engineers capable of working with streaming analytics and complex event processing we have created a dag-based architecture using kafka spark hadoop cassandra and solr that combines data ingestion transformation finite state processing aggregation indexing and immutable storage to essentially create a google-like infrastructure for sensor data (imagine building the equivalent of google analytics and ifttt for the industrial iot) we tackle fun challenges like ingestion of data in real-time—with any need of an api—and self-healing exactly-once processing at thousands of transactions per node per second (we recently converted from storm and mapreduce to spark--streaming and batch )we work in small multi-disciplined teams of product managers hardware engineers data engineers data scientists application engineers and devops professionals you will see your work in production in days or weeks—not months you will get the opportunity and flexibility to explore a wide range of technologies and challenges (you can expand into work in everything from front end tech to data engineering to firmware programming) if you believe “it’s all about the data” and are excited by the combination of a “can do” startup culture and the customer base and financials of an established company and want to use the internet of things to build solutions to tackle big real-world challenges then savi technology is right for you qualifications:bs ba in computer science computer engineering or related degree5+ years experience developing large-scale distributed data platforms and data processing solutions for paas daas platforms internet-scale companies government agencies etc experience working in an agile development environmentyou should also have significant experience in several of the following areas:data ingestion using distributed queuing technologies such as rabbitmq amqp kafka or zeromq streaming analytics and complex event processing using storm spark ibm infostreams or similar technologiesproven experience using mapreduce spark 
other etl elt technologies to process tbs of data daily across hundreds (or even thousands) of continuously running jobsdeep understanding of how to design high-performant data models for multiple nosql data stores (file stores wide column databases key-value stores etc )hands-on experience with hive impala presto and similar tools for sql-like exploration of large-scale data setseven better you also:enjoy designing interactive analytics solutionsare comfortable with the flexibility of high-agile environmentsprefer using continuous integration and deployment to reduce manual workare interested blurring the line between software engineering & data sciencehave experience in geospatial processing at scalesavi is an equal opportunity and affirmative action employer it is our policy to offer employment opportunity to all persons without regard to race color age national origin religion sex gender identity transgender status veteran status disability genetic information pregnancy childbirth or related medical conditions or any other status protected under applicable federal state or local law bonus points for sharing your github account with your resume global head of data quality and governancecategory: location: raleigh north carolina united statesglobal head of data quality and governancethe candidate will lead lexisnexis legal publishing business systems data governance organization the organization is empowered to drive tangible business data value through the delivery and adoption of enterprise governance data policies standards services & supporting technologies the role will work very closely with line of business teams to motivate business data ownership and improve the quality of critical data across the enterprise responsibilities:lead the enterprise data trust organization which includes accountability for:*data governance & principles*data quality control & certification*metadata management*data knowledge management*enterprise data stewardship strategy*enterprise master data management strategydefine and execute the enterprise data trust strategy (18+ month plan) with recurring milestones and appropriate expectation management with executive sponsorsfacilitate the enterprise data oversight committee with senior executive leaders to:*maintain data governance strategy*define data quality priorities*manage data certification results & enterprise scorecard*promote business data ownership*manage data policy compliance*resolve data conflicts of enterprise data issues across lines of businessenterprise data trust advocate and change championenterprise champion and escalation support for line of business data stewardship teamsdefine and promote data quality standards controls and measuresenterprise champion to develop and grow enterprise data knowledge – both employee proprietary knowledge and written documentationidentify leading capabilities within lines of business to be shared and exploited across the enterprisequalifications• bachelor’s degree or master’s degree in relevant field• 5+ years of experience leading enterprise business systems data governance quality initiatives across large complex organizations• 10+ years of data management experience such as bi reporting data stewardship data quality management metadata management data governance data process ownership and or data manufacturing• working understanding of industry best practices & technologies that effectively govern and manage data from various perspectives (data governance curation preparation stewardship analysis & or 
reporting)• relationship management execution & delivery• proven experience partnering with business intelligence analytical actuarial & or data science communities• proven ability to simplify technology and data concepts for business stakeholders to help drive adoption• excellent written verbal communication and presentation skills with ability to effectively communicate with senior management *big picture*actionable plan to execute• results oriented with the demonstrated ability to apply strategic and decisive problem solving skills in ambiguous situations • strong analytical critical thinking and problem solving skills• leadership• strong leadership and influencing skills at the senior management level• build talent within the team through coaching opportunities and team growth by fostering an environment that is a destination for talent • proven ability to create a high performing team that has a culture of continuous learning collaboration and is focused on business value and outcomes• build commitment and empower others communicate with clarity courage and timelinesslexisnexis legal & professional (www lexisnexis com) is a leading global provider of content and technology solutions that enable professionals in legal corporate tax government academic and non-profit organizations to make informed decisions and achieve better business outcomes as a digital pioneer the company was the first to bring legal and business information online with its lexis- and nexis- services today lexisnexis legal & professional harnesses leading-edge technology and world-class content to help professionals work in faster easier and more effective ways through close collaboration with its customers the company ensures organizations can leverage its solutions to reduce risk improve productivity increase profitability and grow their business part of relx group plc lexisnexis legal & professional serves customers in more than 100 countries with 10 000 employees worldwide lexisnexis a division of relx group is an equal opportunity employer: qualified applicants are considered for and treated during employment without regard to race color creed religion sex national origin citizenship status disability status protected veteran status age marital status sexual orientation gender identity genetic information or any other characteristic protected by law if a qualified individual with a disability or disabled veteran needs a reasonable accommodation to use or access our online system that individual should please contact 1 877 734 1938 or accommodations@relx com microsoft envisions a world where passionate innovators come to collaborate envisioning what can be and taking their careers places they simply couldn’t anywhere else this is a world of more possibility more innovation more openness and sky’s the limit thinking; a cloud enabled world microsoft technology centers (mtc) are collaborative environments that provide the place experience and expertise to help our enterprise customers achieve their digital transformation by leveraging the power of microsoft cloud solutions and services we live microsoft’s mission to empower every person and every organization on the plant to achieve more with a focus on our three bold ambitions reinvent productivity & business processes build the intelligent cloud platform create more personal computing mtcs engage early in the project life-cycle where the customer is just beginning to define the vision imagine what is possible and determine if it’s feasible the role requires an 
affinity for evangelism and competitive engagement the ability to win hearts and minds and to explain abstract technical concepts in a business value context is a must the microsoft technology center (mtc) philadelphia located in malvern pa is looking for a data & ai architect with deep technical hands-on experience in designing building and tuning consolidating and managing complex data solutions while exposure to the broad sql platform is desired the emphasis of this role is increasingly on business insight (bi) solutions open source platforms (hadoop) azure data services and data warehouses **responsibilities**as a mtc architect you will be a senior technical solutions leader and accelerate customers’ digital transformation journey through microsoft big data advanced analytics and artificial intelligence you will deliver strategy briefings architecture design sessions and proofs of concept (poc) to articulate the business opportunity propose and validate solution architecture and transfer knowledge on best practices to data scientists business decision makers (bdms) and app dev leads primary accountabilities include influenced revenue & pipeline customer satisfaction customer facing time and contribution to design delivery of innovative solutions that can be showcased at philadelphia mtc and beyond **qualifications**the position requires deep technical expertise in the following areas:+ analyzing large data sets creating actionable insights and visualization by leveraging microsoft business intelligence capabilities including in-depth hands-on knowledge of warehouse architecture analytics reporting tools and sql operational expertise + experience with hadoop big data and sql-based appliances; preferred expertise with python or r deep focus on data iot related azure services including analytics data + storage (sql database document db mysql database sql data warehouse sql server stretch database azure search) intelligence hybrid integration compute (sql vms oracle vms biztalk vms) building data models for business intelligence including competitive bi technologies e g oracle tableau cognos + strong software development skills are a must using both common microsoft tools and common open source platforms + high-level expertise in sophisticated identity authentication security privacy and compliance requirements and experience integrating them into cloud and hybrid solutions required + industry leading experience with cloud hybrid infrastructures networking and adjacent technologies architecture designs migrations and industry standards + deep understanding of cloud computing technologies business drivers and emerging computing trends including machine learning and cognitive services + certification in domain-specific (microsoft or competitive) technologies required + certification in the following technologies preferred: cloud application development technologies (including oss technologies) and azure architecture and development exams (70-532 and or 70-534)the successful candidate will also:+ have demonstrated experience and competency in customer engagement and collaboration + have superior presentation skills and experience engaging with senior level executives + be able to communicate complex concepts in a simple business value context + be a self-starter who takes ownership of opportunities; works independently manages multiple simultaneous projects and deals well with ambiguity and last-minute changes + have the ability and passion necessary to maintain technical excellence with 
emerging technologies including competitive technologies while continuing to manage customer opportunities + have a commitment to customer and partner satisfaction including internal customers strong listening communication and presentation skills and the ability to thrive in a competitive team environment 10+ years of success in consultative complex technical sales and deployment projects participate in internal microsoft technical communities and in the broader industry events and publish blogs whitepapers reference architecture etc in area of expertise bachelor’s degree in computer science information technology or related field required or equivalent work experience combined with a minimum of 7 years of relevant work experience o have experience associated with the energy industry modest travel (15%) outside of the mtc is required microsoft is an equal opportunity employer all qualified applicants will receive consideration for employment without regard to age ancestry color family or medical care leave gender identity or expression genetic information marital status medical condition national origin physical or mental disability political affiliation protected veteran status race religion sex (including pregnancy) sexual orientation or any other characteristic protected by applicable laws regulations and ordinances if you need assistance and or a reasonable accommodation due to a disability during the application or the recruiting process please send a request to askstaff@microsoft com needed: data architect with expertise in postgres and nosql database like mongo db or cassandra; 8 months+; chicago il 60601; skirkwood@htpartners com horizon technology partners has an immediate opening for a sr data architect with experience designing architectures for postgres and nosql database like mongo db or cassandra you are needed to join a project that is kicking off with kraft heinz that will design and bring to market a personalized meal planning system leveraging the kraft heinz nutritional and recipe library and media publications you will be responsible for building an enterprise solution that optimizes and analyzes large-scale data to create business value after architecting the environment in the project this role will transition into the lead database administrator and analyst post- production as a member of the team you will collaborate with data engineers analysts software engineers and product owners to design and implement data architecture and a framework required for personalized meal planning this role is analytical and creative and will be responsible for taking the experience from concept to launch location duration:- work location(s) – based out of downtown chicago (aon center)- duration of assignment – 4 2 to 1 09 19- typical anticipated hours per week schedule - 40 principal accountabilities:- architect robust data solutions to ingest catalog and analyze high-volume high-frequency data in real time to generate business insights - design and build optimal data engineering processes and frameworks considering best practices around efficiency data integrity scalability and maintainability - create optimized workflows and design specification documents to help define data platform features and slas - develop rapid prototypes and proofs of concept to help assess strategic opportunities and future data product capabilities - experience working with api and ingestion of external data - using sound scientific guiding principles in analysis model training testing and validation of 
the data models to create precise high performing and reliable models to be used in product- using software principles to write functional scalable tested and clean deployable code during the implementation stage of algorithms- help solve challenging problems with problem formulation prioritization exploration and implementation- collaborate and work closely with engineering product and design to create high quality reliable products- responsible for the management and maintenance of databases reports and portals- create develop establish standardized best practice reporting and take on responsibility for report updates both weekly and monthly- develop and deploy tracking and measuring tools to allow for identification of areas of opportunity experience and knowledge:- 5+ years of full-time experience in it related fields- minimum of bachelor degree preferably in mis or equivalent it field experience required - 3+ years data engineering experience supporting high-volume high-velocity data streams - strong background in architecting relational databases like postgres and nosql database like mongo db or cassandra (preferred)- ability to write sql queries and use tools such as hadoop tableau (nice to have) and other data reporting tools experience in transactional and data warehouse environments using mysql hive or other database systems must deeply understand joins subqueries window functions etc - ability to use containers like dockers or kubernettes (nice to have)- experience working in a team of programmers as data scientists- experience designing and architecting data solution in aws azure or google cloud (preferred)- ability to architect a data lake is a plus- awareness of automated testing and continuous integration processes related to data engineering- hands-on knowledge of scripting languages ( shell scripts) on linux platforms to perform basic tasks - strong machine learning and statistics track record and expertise on regression methods classification methods clustering neural networks unsupervised learning methods etc (nice to have)- published apps websites or other examples of solutions built skills needed:- excellent verbal and visual communicator- effective team player with minimal supervision and effectively meet project deadlines in an agile environment- able to effectively navigate white space ambiguity and possess creative problem solving- a desire to be accountable for owning problems from design to implementation- an ability to evangelize data models to developers and analysts- flexibility to adapt to multiple standards based on the use case and technology- a bias for action and pragmatic solutions- a low ego and humility; an ability to gain trust through communication and doing what you say you will do - provided by dice sr data architect - nosql (mongo db or cassandra) sql hadoop tableau dockers kubernettes mysql or hive cloud (aws or azure or google cloud) linux shell scripting technical lead data pipeline engineeringupwork is the world’s largest freelancing website each year over $1 billion of work happens through upwork helping businesses get more done and freelancers work anytime anywhere on projects they love at upwork you’ll help build on this momentum together we’ll create economic and social value on a global scale providing a trusted online workplace for professionals to connect collaborate and succeed upwork is looking for a leader leading our data pipeline team to scale and build our next generation data collection infrastructure and capabilities if you've 
ever desired to lead a platform team that process transactions at 10 digit scale from ground up for the future of work this job is for you we are a two-sides on-demand platform company relying heavily on open source technologies as the technical lead you will partner with technical product managers product development engineers data scientists and business intelligence community to establish the vision and turn that vision into reality your responsibilities:develop data pipeline engineering roadmap for upwork’s data infrastructure and management platform including event logging data streaming batch etl processes in data warehouse partner with user interface engineering team defining standards set of web analytic data collection apis and methodologies gather and process raw data at scale including writing scripts web scraping calling apis write sql queries etc work with product engineering team to integrate the data collection instrumentation and real-time data stream processing into our the production systems process unstructured data into a form suitable for analysis define and execute integration with various cloud-based web analytics vendors such as google analytics site catalyst etc evaluate 3rd party and open source data processing and management technologies support business decisions with ad hoc analysis as needed what it takes to catch our eye:technical leadership experience leading an engineering team of 3+ in a medium-to-large size internet company experience in open source sql and non-sql databases like postgres mongodb dynamodb etc programming experience in java python or ruby ideally as full-stack engineer that enables you to collaborate with product development engineers and understand the nuances of data collection and data pipeline issues & solutions how to really knock our socks off:data architecture experience building cloud-base or on-premise data infrastructure management platform with big data technologies like hadoop and spark real-time data streaming service like aws kinesis kafka zookeeper and aws data pipeline emr and other data replication technologies system integration and programming experience building and maintaining a web analytics tracking system experience processing large amounts of structured and unstructured data hands on sql and analytic (google analytics domo etc) experience to support our data science and business analytics communities experience integrating cloud-based data pipeline and analytics services such as aws kinesis aws data pipeline emr google tag management google analytics etc strong analytical mindset and collaborative personality working knowledge in data mining machine learning natural language processing or information retrieval is a plus spancome change how the world works we are an equal opportunity employer at upwork you’ll make a real difference in the world rolling up your sleeves to help reinvent how the world works as a major player in one of the most exciting and fastest growing on-demand talent marketplaces you’ll join a team that’s leading the charge in online work that empower businesses and freelancers alike in short you’ll change lives which will do wonders for your career self-esteem and karma along the way you’ll enjoy a bona-fide work-life balance we practice what we preach with all of the perks you expect from technology leaders in the valley - free breakfast and lunch happy hours team outings and the like plus every wednesday is a work from home day for many teams although our main offices in mountain view san 
what it takes to catch our eye:
technical leadership experience leading an engineering team of 3+ in a medium-to-large size internet company
experience in open source sql and non-sql databases like postgres mongodb dynamodb etc
programming experience in java python or ruby ideally as a full-stack engineer that enables you to collaborate with product development engineers and understand the nuances of data collection and data pipeline issues & solutions
how to really knock our socks off:
data architecture experience building cloud-based or on-premise data infrastructure management platforms with big data technologies like hadoop and spark real-time data streaming services like aws kinesis kafka zookeeper and aws data pipeline emr and other data replication technologies
system integration and programming experience building and maintaining a web analytics tracking system
experience processing large amounts of structured and unstructured data
hands on sql and analytics (google analytics domo etc) experience to support our data science and business analytics communities
experience integrating cloud-based data pipeline and analytics services such as aws kinesis aws data pipeline emr google tag management google analytics etc
strong analytical mindset and collaborative personality
working knowledge in data mining machine learning natural language processing or information retrieval is a plus
come change how the world works we are an equal opportunity employer at upwork you’ll make a real difference in the world rolling up your sleeves to help reinvent how the world works as a major player in one of the most exciting and fastest growing on-demand talent marketplaces you’ll join a team that’s leading the charge in online work that empowers businesses and freelancers alike in short you’ll change lives which will do wonders for your career self-esteem and karma along the way you’ll enjoy a bona-fide work-life balance we practice what we preach with all of the perks you expect from technology leaders in the valley - free breakfast and lunch happy hours team outings and the like plus every wednesday is a work from home day for many teams although our main offices in mountain view san francisco chicago and oslo are warm and inviting naturally you’ll also draw a nice paycheck and enjoy some meaty benefits so join us to change the future of work

trident consulting is seeking a big data engineer for one of our clients - “a global leader in business and technology services” title: big data engineer location: charlotte north carolina type: contract responsibilities: master's degree in statistical analytics data science or actuarial science and at least 1-2 years of relevant work experience a bachelor's degree will be considered with additional relevant work experience the resources must have domain technical experience in delivering data engineering solutions using data lake technology and experience working with our it support team experience with the following: hadoop (cdh) relational databases and sql etl development spark data validation and testing (data warehousing etl elt to the data lake using the data lake for data analysis (hadoop tools hive impala pig sqoop hue kafka etc python r java docker dakota) knowledge of cloud platform implementation (azure or amazon) knowledge of data visualization tools is also a plus (axa us uses tableau on multiple platforms along with python visualization in the data lake using pandas and bokeh packages) experience with collaborative development workflows (e g microsoft vsts) relevant technical skills include applied mathematics statistics calculus quantitative or statistical methods or techniques data mining informatics machine learning data science programming computational algorithms databases artificial intelligence natural language processing bayesian inference markov logic java software engineering and or systems design analysis excellent written verbal and interpersonal skills a must as there will be significant collaboration with the business and it for more details contact antony at 925-215-1143 email: antony@tridentconsultinginc com trident consulting handles the staffing and management of part or all of the recruitment process for our customers wishing to outsource their staffing requirements from job profiling providing new staff technology to on boarding a new hire we support our customers in their future business needs

we have the following urgent role with our direct client
title: sr engineer-back-end
location: san bruno ca 94066
duration: 6+ months
rate: market
must haves:
- built a datamart leveraging storing web analytics ad logs reporting data through on-premise and cloud infrastructure for real-time dashboards
- expertise in azure hadoop presto spark
- built data etl pipelines manage dev ops and db environments to meet 99.5% slas
- integrated & stored highly transactional data from apis and provided the data into bi visualization tools (e g domo)
pluses:
- architecture of building highly scalable and transactional infrastructure & open source toolkits
- front-end experience and knowledge of integration with tools like thoughtspot looker
- api expertise with 3rd party partners like facebook
- experience in working with highly sensitive and sensitive datasets and familiarity with architecture must-haves to be privacy & security compliant
responsibilities:
build scalable high-performance and efficient pipelines and workflows that are capable of processing billions of transactions and real-time customer activities work with big data and provide to our data scientists the right tools data marts and rollups to build their machine learning models fluent in pig and or hive with
experience in building udfs pig and hadoop streaming build automated reports that can help the team proactively identify quality and or coverage problems in releases or new versions of our models apply knowledge of azkaban oozie or hamake for workflow management and job scheduling provide senior leadership and demonstrable programming expertise and proficiency in java c c++ or python work on data warehousing architecture and data modeling best practices
qualifications:
bachelor of science degree or equivalent in computer science computer engineering electrical engineering or a related field plus 7-10 years of software engineering experience at a senior level; or a master’s degree or higher with 5-7 years of senior software engineering experience
required:
must have demonstrable programming proficiency in one or more of the following: java c c++ or python
deep understanding of the map reduce framework & hadoop
fluent in pig and or hive with experience in building udfs (see the sketch after this posting)
strong scripting ability
proven expertise and understanding of etl techniques
knowledge of azkaban oozie or hamake for workflow management and job scheduling
must be team oriented and collaborative to interact with both managers and cross functional teams
ability to thrive in a fast paced environment on multiple projects in various phases and under tight deadlines
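
the udf requirement above translates naturally to a small example; this is a minimal pyspark analogue of a hive or pig udf (assumes a local pyspark install; column names are illustrative):

# minimal pyspark udf sketch: row-level cleanup applied as a column expression
from pyspark.sql import SparkSession
from pyspark.sql.functions import udf
from pyspark.sql.types import StringType

spark = SparkSession.builder.appName("udf-sketch").getOrCreate()

df = spark.createDataFrame(
    [(1, "  New York "), (2, "san bruno"), (3, None)],
    ["id", "city"],
)

@udf(returnType=StringType())
def normalize_city(value):
    # defensive cleanup so malformed rows don't fail the whole job
    if value is None:
        return None
    return value.strip().lower()

df.withColumn("city_norm", normalize_city("city")).show()

in hive proper the same cleanup would be packaged as a java class extending udf and registered with create temporary function.
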
zendesk is a fast growing cloud based software company utilizing standard saas based applications to support its business operations (google apps & gcp salesforce zuora netsuite workday slack etc + our own instances of zendesk of course!) with growth comes increased complexity of customers products and deals as well as the need for more advanced analytics for business insights we're looking for a seasoned saas data architect who has broad experience and a passion for what they do to drive our continued innovation in this space this role leads data architecture at zendesk supports the enterprise's databases and helps drive the data analytics strategies and platforms as a strategic and tactical leader at zendesk this individual impacts multiple technically complex mission-critical and or high-profile initiatives simultaneously and must possess executive leadership in combination with an extensive deep technical understanding of data architecture big data application databases and integration design must also have deep knowledge of master data management reference data data integration metadata and data standards this position leads the team that is currently building and owns the central repository for all data within zendesk and serves the needs of our internal customers business units and product managers
what you get to do everyday:
develop and own the cloud-based data and analytics platform that will define and change the way zendesk works with data
ensure the new data and analytics ecosystem supports reporting and operational needs as well as advanced analytics across the company
be a strategic leader driving the vision building relationships and proactively collaborating with key business and technical stakeholders at all levels throughout the organization also roll up your sleeves and be a hands-on contributing member of the data platform team; write code conduct code review develop & mentor engineers solution for both process and technical challenges
own the technical roadmap and alignment to strategic objectives set by the executive team
own the tactical roadmap; run daily stand-ups facilitate communication between platform team and stakeholders
develop zendesk's approach to master data management and data stewardship
collaborate with the analytics lead to enable the analyst community and deliver value rapidly and iteratively to the enterprise
partner with security & compliance teams to ensure we deploy policies technologies and controls that protect our data applications and infrastructure
continually enhance and enrich the data platform from data ingestion through reporting to data science insights
what you bring to the role:
10+ years experience in business intelligence data warehousing and analytics
strong technical background; bachelor's degree required graduate degree preferred
great team player who can work in a collaborative and iterative environment
proven ability to build manage and lead analytic engineering teams
excellent communication and presentation skills with stakeholders at all levels of the organization
proven ability to architect design and build analytics platforms at an enterprise level
ability to translate long term strategic objectives into a technical roadmap and tactical implementation plan
strong programming and sql skills across multiple platforms
modern cloud architecture experience involving analytic systems
finance experience preferred
professional services consulting experience preferred
data science experience preferred
zendesk builds software for better customer relationships it empowers organizations to improve customer engagement and better understand their customers zendesk products are easy to use and implement they give organizations the flexibility to move quickly focus on innovation and scale with their growth based in san francisco zendesk has operations in the united states europe asia australia and south america learn more at www zendesk com individuals seeking employment at zendesk are considered without regard to race color religion national origin age sex marital status ancestry physical or mental disability veteran status or sexual orientation

i need a solutions architect with a data management focus with healthcare health insurance background person should have a degree in computer science engineering or related field with more than 10 years experience and able to do internal and external consulting type work as well as participate in various data strategy monetization efforts resource must have worked with data scientists hadoop spark hive cloudera hortonworks experience in ab initio a big plus

the mission: our daily fight with cyber bad guys requires us to collect and analyze a lot of data a lot of data!
and as our customer base continues its rapid growth we must look at faster and more robust tools to help us and our customers make the best decisions possible as an expert with hadoop and big data technologies you will add your tools-building superpowers to a small team tasked with building out a devops automation environment one that will step up our business intelligence game and help us protect our customers from cyber intruders
we offer the chance to be part of an important mission: ending breaches and protecting our way of digital life if you are a motivated intelligent creative and hardworking individual then this job is for you!
the job: as a big data engineer you will be an integral member of our big data & analytics team responsible for design and development
partner with data analysts product owners and data scientists to better understand requirements finding bottlenecks resolutions etc
design and develop different architectural models for our scalable data processing as well as scalable data storage
build data pipelines and etl using heterogeneous sources
you will build data ingestion from various source systems to hadoop using kafka flume sqoop spark streaming etc (see the sketch after this posting)
you will transform data using data mapping and data processing capabilities like mapreduce spark sql
you will be responsible to ensure that the platform goes through continuous integration (ci) and continuous deployment (cd) with devops automation
expand and grow data platform capabilities to solve new data problems and challenges
support big data and batch real time analytical solutions leveraging transformational technologies like apache beam
you will have the ability to research and assess open source technologies and components to recommend and integrate into the design and implementation
you will work with development and qa teams to design ingestion pipelines integration apis and provide hadoop ecosystem services
the skills:
4+ years of experience with the hadoop ecosystem and big data technologies
ability to dynamically adapt to conventional big-data frameworks and tools with the use-cases required by the project
hands-on experience with the hadoop eco-system (hdfs mapreduce hbase hive impala spark kafka kudu solr)
experience with building stream-processing systems using solutions such as spark-streaming storm or flink etc
experience in other open-source tools like druid elasticsearch logstash etc is a plus
knowledge of design strategies for developing scalable resilient always-on data lakes
experience in agile (scrum) development methodology
strong development automation skills
must be very comfortable with reading and writing scala python or java code
excellent inter-personal and teamwork skills
can-do attitude on problem solving quality and ability to execute
degree in bachelor of science in computer science or equivalent
learn more about palo alto networks here and check out our fast facts
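
a minimal sketch of the kafka-to-hadoop ingestion path described above using spark structured streaming (assumes the spark-sql-kafka connector package is on the classpath; broker address topic and paths are hypothetical):

# minimal structured streaming sketch: land a kafka topic on hdfs as parquet
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("ingest-sketch").getOrCreate()

# read a stream of raw events from kafka
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "security-events")
    .load()
)

# kafka delivers binary key/value columns; cast to strings before parsing downstream
parsed = events.select(
    col("key").cast("string"),
    col("value").cast("string"),
    col("timestamp"),
)

# checkpointing lets the job recover without duplicating output
query = (
    parsed.writeStream.format("parquet")
    .option("path", "hdfs:///data/raw/security-events")
    .option("checkpointLocation", "hdfs:///checkpoints/security-events")
    .start()
)
query.awaitTermination()
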
senior data engineer

the business intelligence team within labs @ ef tours is responsible for consuming all data created by the business and preparing it in a way that can be easily digested by the analyst teams and data scientists we work with we have been anchored in microsoft technologies since inception and having recently migrated to aws we have the opportunity to explore new technologies available to us we are looking to expand our team in order to execute faster on a data engineering strategy and to have the capacity to explore and adopt new technologies a data engineer on our team will be involved in maintaining and improving the way we move and store data in an effort to transform business intelligence across ef tours
responsibilities:
build and maintain datamarts using the kimball data warehousing methodology (see the sketch after this posting)
write and tune t-sql using all available sql server objects for the purpose of building and maintaining data marts
monitor and maintain system health and security
create data feeds using ssis sql server replication and additional non-sql tools
design architecture based on current and future technology landscapes
oversee administration and improvements to source control and deployment process
monitor performance and advise any necessary infrastructure changes
prepare unit tests for all work to be released to our live environment (including data validation scripts for data sets releases or changes)
implement performance tuning on the databases based on monitoring
requirements:
bs in computer science engineering or related field
5+ years of professional development experience
advanced experience with data modeling (ability to tailor design to specific need)
strong transact-sql (t-sql) skills
sql server 2014+ experience (2016 preferred)
sql server integration services experience
highly organized and commented coding style
proven ability to understand business concepts and relate them to data
experience integrating with data visualization tools for mass consumption (including qlikview powerbi)
powershell experience preferred
exposure to agile methodology and scrum a plus
willingness and ability to learn new technologies and methodologies
excellent written and spoken english
ef education first is a global leader in international education with a 50-year mission of opening the world through education our programs worldwide provide opportunities for language learning educational travel cultural exchange and academic study for customers of all backgrounds and ages helping them to become better citizens of the world a career with ef combines the support and opportunity of a large organization with the spirit and energy of a small company we look for thinkers and doers – creative collaborative and motivated people who are excited by education communication and travel
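
the kimball datamart item above usually comes down to dimension maintenance; here is a minimal type-2 slowly-changing-dimension sketch using sqlite as a stand-in for sql server (table and column names are illustrative):

# minimal kimball-style type-2 dimension update: expire the old row, insert the new version
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    create table dim_customer (
        customer_key integer primary key autoincrement,
        customer_id  integer,
        city         text,
        valid_from   text,
        valid_to     text,      -- null = current row
        is_current   integer
    );
    insert into dim_customer (customer_id, city, valid_from, valid_to, is_current)
    values (42, 'boston', '2017-01-01', null, 1);
""")

def apply_scd2(conn, customer_id, new_city, load_date):
    # only version the row when the tracked attribute actually changed
    cur = conn.execute(
        "select customer_key, city from dim_customer "
        "where customer_id = ? and is_current = 1", (customer_id,))
    row = cur.fetchone()
    if row and row[1] != new_city:
        conn.execute(
            "update dim_customer set valid_to = ?, is_current = 0 "
            "where customer_key = ?", (load_date, row[0]))
        conn.execute(
            "insert into dim_customer (customer_id, city, valid_from, valid_to, is_current) "
            "values (?, ?, ?, null, 1)", (customer_id, new_city, load_date))
    conn.commit()

apply_scd2(conn, 42, "cambridge", "2018-06-01")
for r in conn.execute("select * from dim_customer"):
    print(r)
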
software guidance & assistance inc (sga) is searching for a data wrangler for a contract assignment with one of our premier clients in white plains ny
responsibilities:
wrangle data (a k a data engineering) from a wide variety of complex data sets large volumes of data into normalized and enriched data sets assemble and evaluate data such that new insights solutions and visualizations can be derived this individual will develop a deep understanding of analytical needs in order to enable the company data assets big data technologies keying and linking concepts to analyze internal and external utility industry data to create insights
perform data discoveries to understand data formats source systems file sizes etc this will include engaging with internal and external business partners in the discovery process
deliver data acquisition transformations normalization mapping and loading of data into a variety of data models
clear understanding of and the ability to navigate file structures such as csv files excel spreadsheets relational databases sap business warehouse cubes to locate and extract data
ability to design and architect a method of extracting data from its original source to another location (i e a data lake) to make that data consumable for a variety of downstream purposes such as applications visualizations analytics and other related data science activities
basic understanding of data science and the ability to support a data science team
ability to transform and map one "raw" data form into another format with the intent of making it more appropriate and valuable for a variety of downstream purposes such as analytics (see the sketch after this posting)
interact with all levels of people including data stewards analytics professionals it sales and customers engineers and power plant personnel
interact with customer business technologists in order to satisfy customer needs
assist in the development of etl migration plans for all internal and external data assets not currently integrated into an internal data lake including developing a level of subject matter expertise with each new data asset being loaded
lead and execute activities to support all aspects of bringing new data sources into nypa's data lake environment
collaborate extensively with technology to design data ingestion data models and automated operational metrics for consistent high quality data loading into the company data platform
develop strategies standards and best practices in the areas of data wrangling data visualization and data integration and lead adoption throughout the company
ensure quality coverage and accuracy of data validate data quality content in conjunction with respective data stewards by developing reports and tools to monitor and visualize data
develop and collaborate extensively with data analytics and technology leads to ensure the seamless consumption of insights generated from the company emerging big data analytical platform
required skills:
expert level proficiency in sql
expert level proficiency in microsoft excel
experience working independently with minimal guidance
strong problem solving and troubleshooting skills with experience exercising mature judgment
proficiency in multiple programming languages
comfortable working with open-source technologies
agile methodologies scrum xp particularly with more than one team within an organization iterative project delivery scrum experience experience in multiple roles of an agile team experience of coaching teams experience in workshop facilitation experience in software process improvements
ability to work directly with a highly experienced user community
must have effective collaboration skills
exceptional ability to clearly and effectively communicate with it peers and users at all levels of technical and business acumen
strong time management skills and multi-tasking capabilities
ability to understand business rules governing financial and budgeting process
an understanding of technical concepts principles and terminology
exceptional oral and written communication skills
self-motivated
ability to set priorities and meet deadlines
ability to handle multiple tasks assignments projects simultaneously
strong attention to detail
ability to clearly and correctly state and summarize technical information
ability to present findings to both technical and non-technical audiences
experience with etl and or other big data processes
ba in computer science
preferred skills:
knowledge of other agile methodologies such as crystal dsdm atern rup is a plus
experience in evangelizing agile is a plus
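
a minimal sketch of the transform-and-map bullet above using pandas: reshape a wide "raw" extract into a tidy analysis-friendly table (the plant and column names are hypothetical):

# minimal wrangling sketch: melt a wide extract into one observation per row
import pandas as pd

# raw extract: one row per plant, one column per month
raw = pd.DataFrame({
    "plant":   ["niagara", "st lawrence"],
    "2018-01": [410.5, 388.2],
    "2018-02": [395.0, 401.7],
})

# normalize to a tidy shape, typed and ready for loading downstream
tidy = raw.melt(id_vars="plant", var_name="month", value_name="net_generation_gwh")
tidy["month"] = pd.to_datetime(tidy["month"])

print(tidy)
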
sga is a certified women's business enterprise (wbe) celebrating over thirty years of service to our national client base for both permanent placement and consulting opportunities for consulting positions we offer a variety of benefit options including but not limited to health & dental insurance paid vacation timely payment via direct deposit sga accepts transfers of h1 sponsorship for most contracting roles we are unable to sponsor for right-to-hire fulltime or government roles all parties authorized to work in the us are encouraged to apply for all roles only those authorized to work for government entities will be considered for government roles please inquire about our referral program if you would like to submit a candidate for any of our open or future job opportunities sga is an eeo employer we encourage veterans to apply to view all of our available job postings and or to learn more about sga please visit us online at www sgainc com

have you ever had the opportunity to impact the lives of millions of people in a meaningful way and help them enjoy time away with their friends and families building memories? that is what we do here at homeaway com we are the leading vacation rental website in the world with more than 1 million online bookable listings as part of the digital platform engineering team you will have a unique opportunity to develop software solutions that capture data in real-time from a variety of sources and make data available programmatically to multiple stakeholders ranging from data scientists to business teams to end customers you will be working on cutting-edge technologies to solve problems of scale and speed
technologies we use: aws mesos dropwizard elasticsearch cassandra hbase kafka hadoop spark samza confluent cascading java8
requirements:
- bs or ms in computer science or equivalent work experience
- 3+ years of professional java c# software development experience
- experience with all aspects of data systems (both big data and traditional) including database design etl aggregation strategy performance optimization
- experience developing applications on big data streaming platforms
responsibilities:
- develop techniques to process and analyze events on a real-time streaming platform
- work closely with marketing to implement and expand data connections for email marketing purposes
- design and implement high-performance data models in nosql database technologies
- develop and configure hadoop big data components (e g oozie cascading spark)
- develop quality scalable tested and reliable data services using industry best practices
- create and maintain quality software using best-in-class tools: git splunk new relic sonar and teamcity
benefits:
- competitive health and insurance benefits
- competitive salary
- annual target bonus or commission
- paid vacation and sick time
- vacation rental on a yearly basis (taxable benefit)
- employee stock purchase program
- free snacks and beverages
- frequent company update talks with our leadership team
- free listing on homeaway com
- electric adjustable stand-up desk
- discounted metro & rail pass
- casual dress code

hi we’re tivo at our core we’re innovators who continuously seek to fuel the ultimate entertainment experience we touch the lives of binge-watching music-loving entertainment fanatics every day by inventing and delivering beautiful user experiences and enabling the world’s leading media and entertainment providers to nurture more meaningful relationships with their audiences
position description: the principal big data engineer will analyze design program debug and modify software enhancements and or new products lead development of both transactional and data warehouse designs with our team of big data engineers and data scientists design implement
and tune tables queries stored procedures and indexes work in an agile scrum driven environment to deliver new and innovative products for analytics customers; and keep up-to-date with relevant technology in order to maintain and improve functionality for authored applications
education required: bachelor’s degree or foreign equivalent in electronic engineering computer science or related field
experience required: 10+ years of progressive experience as a systems software engineer application developer or related occupation
special requirements: must have at least 4 years of prior work experience in the following:
programming in java and python
writing complex sql and etl batch processes
working with large data volumes including processing transforming and transporting large scale data using the big data stack: mapreduce hive sql spark etc
data warehousing and analytic architecture implementation on a major rdbms including at least one of the following: oracle mysql and or sqlserver
amazon web services including at least one of the following: on demand computing s3 and or an equivalent cloud computing approach
building custom data loads using a scripting language python or shell script (see the sketch after this posting)
experience with the big data stack of technologies including hadoop hdfs hive and hbase
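
the custom-data-load item above might look like the following minimal sketch - stage a file from s3 with boto3 and bulk-load it into a relational table (bucket key and table names are hypothetical and sqlite stands in for the target rdbms):

# minimal custom data load: stage from s3, then bulk insert into a table
import csv
import sqlite3

import boto3

def stage_from_s3(bucket, key, local_path):
    # pull the raw extract down to local staging
    boto3.client("s3").download_file(bucket, key, local_path)

def bulk_load(local_path, conn):
    conn.execute("create table if not exists viewing_events (device_id text, ts text, channel text)")
    with open(local_path, newline="") as f:
        rows = ((r["device_id"], r["ts"], r["channel"]) for r in csv.DictReader(f))
        conn.executemany("insert into viewing_events values (?, ?, ?)", rows)
    conn.commit()

if __name__ == "__main__":
    stage_from_s3("example-bucket", "exports/viewing_events.csv", "/tmp/viewing_events.csv")
    bulk_load("/tmp/viewing_events.csv", sqlite3.connect("warehouse.db"))
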
senior cloud data architect
location: owings mills md
the senior cloud data architect identifies researches and implements leading edge technologies and practices this architect has a depth and breadth of professional experience in data technologies processes and practices and related areas develops reference architectures for data acquisition storage and distribution data products solution evaluations business intelligence data warehousing metadata management data quality management data modeling and mdm
steers strategic technology direction defines target state architecture technology roadmaps builds reference implementations and mentors others while championing and maturing the enterprise architecture practices
builds and maintains the enterprise information model and articulates the data produced and consumed by the enterprise
provides architecture guidance on the identification definition build-out and operationalization of the strategic data entities
ensures that the enterprise follows enterprise data standards and practices and that they align with the data policy governance and security requirements
formulates consistent data integration and distribution patterns and practices socializes and implements them to ensure integrity of the strategic data
architects and follows through the implementation of the enterprise data lake on aws technologies to better manage the data lifecycle and enable next generation data analytics and operational use cases
prepares research presentations whitepapers proposals and sample applications that demonstrate how technology can affect often complex systems and increase the effectiveness of the firm
partners with the channel manager and the business unit leaders and managers to define and review technological strategies
provides technical oversight to the implementation of technical strategy
required:
10+ years of hands-on technical experience in an architecture role
experience architecting & managing data integration & data warehouse platforms on cloud solutions such as aws
data modeling & solution design experience in oltp & olap environments
strong analytics & reporting skills – experience with well-known bi data blending data virtualization & reporting tools
strong data integration skills and experience especially around moving large data sets in batch & near real time across the cloud
expertise in building internet scale solutions in the cloud
hands-on experience with traditional rdbms platforms like oracle db2 & ms-sql
hands-on sql & procedural sql coding skills – past performance tuning of queries and data stores
must have a thorough understanding of agile development methodologies
preferred:
advanced degree in technical engineering or related quantitative field
hands-on experience with nosql databases like cassandra & dynamo & big data solutions like hadoop hive pig etc
additional exposure to machine learning & data science techniques
demonstrated successful experience implementing technology solutions resulting in significant impact to the business
aws associate solution architect certification
previous experience as a data engineer or data scientist
previous experience in architecting and developing a commercial product
experience with web and enterprise content management tools and web services soa

about: want to be part of a company which has pioneered digital promotions and advertising solutions driven by data we innovate every day to connect shoppers retailers and thousands of brands to provide a world class shopping experience quotient technology inc is a leading data-driven digital promotions and media company that connects brands retailers and shoppers quotient's insights platform combines exclusive shopper data from select retailers with unique shopper insights and behaviors from coupons com and thousands of publishing partners within our network through this platform we will be able to bring to market enhanced targeting and measurement capabilities across promotions and media providing this level of insight requires cutting edge technologies in many areas including big data we are currently seeking a hands on senior data infrastructure architect specializing in big data technologies specifically hive hbase sqoop map reduce yarn zookeeper oozie our team provides 24x7 operational support of the hadoop infrastructure that powers many of our most innovative products and services the ideal candidate will demonstrate strong technical aptitude experience in supporting big data environments and solid customer support skills we look for people with the skills and passion to thrive at a company where innovation is the norm collaboration is paramount and fun is always part of the mix
overview: as part of the data infrastructure team you will play a key role in managing the hadoop infrastructure and associated support systems currently this environment is at a nascent phase scaling to tens of petabytes in size with hadoop clusters ranging around 100 - 150 nodes across multiple data centers the demands on this system are growing steadily as we increase the data ingestions from retailers and add more functionality and products around the data you will be responsible for the performance improvements and optimization of our hadoop ecosystem components working closely with the engineering and product teams you'll be taking on interesting and complex challenges to scale the current infrastructure setups for production staging and disaster recovery cluster(s) a successful candidate for this position will be self-motivated with an attitude of getting things done should be able to see the big picture as well as be able to deep-dive into details solving complex problems
responsibilities:
own and maintain operational best practices for smooth operation of large hadoop clusters
apply in-depth analysis of
hadoop-based workload project-based work devise solutions and evaluate their effectiveness
optimize and tune the hadoop environment(s) to meet the performance requirements
partner with hadoop developers in building best practices for warehouse and analytics environments
investigate emerging technologies in the hadoop ecosystem that relate to our needs and implement those technologies
required skills:
3+ years of hands-on experience in deploying and administering hadoop clusters
strong problem solving and troubleshooting skills
good linux administration and troubleshooting skills
good understanding of hadoop design principles and the factors that affect distributed system performance
working knowledge in administering and managing hadoop clusters - preferably in cloudera installations
experience in troubleshooting on big-data technologies like hdfs mongodb spark oozie kafka zookeeper etc
good scripting experience with at least two of the following: shell python ruby or perl
preferably experience with cloud services and computing
knowledge in metric collection for monitoring and alerting (see the sketch after this posting)
bs ms degree in computer science or big data data science related fields
ability to prioritize work and establish consistent responsiveness to customer inquiries - with a strong focus on service delivery
qualifications:
bachelor's degree in computer science or related field
5+ years progressive technology experience including experience in the following:
advanced big data hadoop infrastructure experience
exposure to software engineering and development including experience writing scripts and understanding code
intermediate unix linux administration experience
ability to work effectively with cross-functional and cross-cultural teams
ability to investigate complex issues spanning multiple technologies
experience working with offshore teams and providing strategic guidance to the teams
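
a minimal sketch of the metric-collection item above: sample hdfs capacity through the standard hdfs dfsadmin -report command and flag low headroom (the 85% threshold is an illustrative choice not from the posting):

# minimal monitoring sketch: parse dfsadmin output and alert on low headroom
import re
import subprocess

def hdfs_used_percent():
    report = subprocess.run(
        ["hdfs", "dfsadmin", "-report"],
        capture_output=True, text=True, check=True,
    ).stdout
    # the summary block contains a line like: "DFS Used%: 63.20%"
    match = re.search(r"DFS Used%:\s*([\d.]+)%", report)
    return float(match.group(1)) if match else None

if __name__ == "__main__":
    used = hdfs_used_percent()
    if used is not None and used > 85.0:
        print(f"ALERT: hdfs is {used:.1f}% full - plan compaction or node expansion")
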
company description: common purpose uncommon opportunity everyone at visa works with one goal in mind – making sure that visa is the best way to pay and be paid for everyone everywhere this is our global vision and the common purpose that unites the entire visa team as a global payments technology company tech is at the heart of what we do cybersource a visa company has been and continues to be a pioneer within the e-commerce payment management world our visanet network is capable of handling over 65 000 transaction messages per second for people and businesses around the world enabling them to use digital currency instead of cash and checks we are also global advocates for financial inclusion working with partners around the world to help those who lack access to financial services join the global economy visa’s sponsorships including the olympics and fifa™ world cup celebrate teamwork diversity and excellence throughout the world if you have a passion to make a difference in the lives of people around the world visa offers an uncommon opportunity to build a strong thriving career visa is fueled by our team of talented employees who continuously raise the bar on delivering the convenience and security of digital currency to people all over the world join our team and find out how visa is everywhere you want to be
job description: visa processes 100s of millions of payment transactions per day and has one of the world’s most valuable data assets the visa data product development organization (dpd) has the mission of "unleashing the power of data to drive global commerce" the data platform team is responsible for the horizontal data services that drive visa's data and information products including risk and authorization loyalty and commercial products the merchant data platform team builds visa's merchant repository and merchant applications that power critical visa applications and business analytics as a lead you will strategize define develop and manage architecture and solutions for merchant data publish and merchant data quality you will act as a technical expert and leader providing guidance and support to development teams superior organization written communication and verbal communication skills are required
key responsibilities:
responsible for the delivery of software solutions for publishing merchant information via apis on the visa developer platform (vdp) merchant reference tables in relational and non-relational stores and merchant profile services as part of visa's data platform
establishes software development patterns and best practices via examples and shipping code
ensures that your development teams follow a common set of principles and patterns and utilize a standard set of technology frameworks and libraries
guide teams in building scalable maintainable supportable innovative solutions to challenging problems in the merchant space
work with product partners across the organization in defining product road-maps strategy and execution plan for the merchant domain
build extend the merchant data quality framework to pro-actively identify data quality issues prior to data consumption by subscriptions (see the sketch after this posting)
support real time and batch consumption of merchant attributes to a variety of subscriptions
evangelize merchant platform and solutions and increase adoption of visa's global merchant repository within and outside of visa
mentor the team in achieving technical and development deliverables and build a high performing world class engineering and data science team
qualifications:
bs ms in computer science computer engineering science math
a minimum of 10 years of experience in architecture and development 5 years with experience in data analytics building data driven solutions and frameworks
proven knowledge of successful design architecture and development using hadoop with large data volumes
possesses a deep understanding of text mining search technology high volume data processing and web service apis
continuous delivery and dev ops experience - infrastructure automation monitoring logging auditing and security implementation and practices
a minimum of 3 years experience with hadoop and related technologies is required
a minimum of 5 years experience with rdbms development (oracle db2 etc ) is required
knowledge of nosql stores (e g hbase and cassandra) technologies is a plus
prior experience in managing medium size teams working with product managers in negotiating and managing deliverables schedules setting expectations is required
additional information: all your information will be kept confidential according to eeo guidelines visa will consider for employment qualified applicants with criminal histories in a manner consistent with the requirements of article 49 of the san francisco police code
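
the data quality framework item above is essentially rule evaluation before consumption; a minimal pandas sketch of that shape (field names and rules are illustrative not visa's):

# minimal rule-based data-quality check: evaluate every rule before a batch is consumed
import pandas as pd

CHECKS = {
    "merchant_id is unique": lambda df: df["merchant_id"].is_unique,
    "merchant_name present": lambda df: df["merchant_name"].notna().all(),
    "mcc is four digits":    lambda df: df["mcc"].astype(str).str.fullmatch(r"\d{4}").all(),
}

def run_checks(df):
    # collect every failing rule so the whole batch can be rejected or quarantined
    return [name for name, check in CHECKS.items() if not check(df)]

batch = pd.DataFrame({
    "merchant_id":   [1, 2, 2],
    "merchant_name": ["acme coffee", None, "main st books"],
    "mcc":           ["5814", "5942", "594"],
})

failed = run_checks(batch)
if failed:
    print("batch rejected; failed checks:", failed)
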
big data engineer
nutanix is disrupting the multibillion dollar enterprise cloud market by pioneering the first converged compute & storage virtualization platform that can incrementally scale out to manage petabytes of data while running tens of thousands of virtual machines the nutanix insights team focuses on building a big data platform to derive insights from data collected from data centers across our customer base these insights will drive the efforts towards data center optimization pro-active support and auto-remediation operations and engineering process optimizations and product features we are seeking experienced engineers and architects with big data platform infrastructure analytics expertise for petabyte scale data and a passion to drive the design of next generation data center and cloud infrastructures
key responsibilities:
engineer design and build a big data analytics platform for the software defined data center
build systems and software to deliver deep data science based insights
work on performance scaling out and resiliency of distributed systems
work closely with development test documentation and product management teams to deliver high quality products and services in a fast paced environment
requirements:
bs ms or phd in computer science or related field
experience in big data infrastructure and software engineering
strong knowledge of designing and development of distributed systems software on large datasets for machine learning data mining and analytics
knowledge of algorithms and optimization
experience with big data infrastructure and working technologies such as kafka spark the hadoop ecosystem and mapreduce
streaming analytics and real time processing experience such as spark streaming storm is desirable
experience in designing and performance tuning data pipeline jobs is desirable
datacenter monitoring and telemetry experience such as snap collectd or any other data log collection is desirable

req id: 31843
job family: information technology software development
sabre is the global leader in innovative technology that leads the travel industry we are always looking for bright and driven people who have a penchant for technology and want to hone their skills if you are interested in challenging work being part of a global team and solving complex problems through technology business intelligence and analytics and agile practices - then sabre is right for you! it is our people who develop and deliver powerful solutions that meet the current and future needs for our airline hotel and travel agency customers
job description work environment: the successful candidate will work in an enterprise data and analytics (eda) group as part of a next gen platform team this team works with the latest data and analytics technologies building a next generation data services platform using microservices containers cloud native services and others the team is responsible to build and manage the platform for business units and customers solution architecture building tools components and services this platform also supports self service capabilities to users both technical and business i e developers analysts and data science platform data and services must be of high quality and reliability in batch and near real time modes the platform infrastructure is a hybrid model on premise and multi cloud the ideal candidate must enjoy being in a lead role but be able to maintain their own individual work strong individual contributor and team player!
you need to have a strong background in aws
duties:
+ design and develop big data solutions and services based on business and solution requirements as well as legal security privacy and firm policy constraints
+ lead system performance benchmarking and establish standards and best practices to support high-performing computing environments
+ recommend development team tools processes and deliverables; recommend and implement process improvements
+ provide ongoing direction design of application suite architecture and data architecture
+ work cross-functionally to understand the linkage between business goals business architectures and technology architectures
job requirements
background: required
+ ba bs degree or equivalent experience; engineering or math background preferred
+ 10+ years of progressive data engineering experience with high performance data platforms on large scale development efforts leveraging a cloud provider such as aws
+ 5+ years of data architecture data science data integration and big data technologies
+ hands on experience on data services paas for data & analytics use cases
+ hands on experience on aws cloud services (emr s3 redshift rds lambda etc )
+ experience with dev ops and ci cd frameworks
+ strong teamwork skills ability to learn quickly excellent written and verbal communication
+ strong programming skill i e python java scala etc
+ experience working on distributed data processing frameworks i e spark mapreduce
+ experience working on realtime data processing technologies i e kafka spark streaming
+ learning new & emerging technologies and providing them as platform services to solve the business problem
background: preferred
+ experience with building data services platforms utilizing container technologies such as docker kubernetes openshift etc
+ experience with microservices platforms such as pivotal cloud foundry aws openshift etc
+ familiarity with serverless architecture i e lambda etc (see the sketch after this posting)
+ familiarity with app monitoring tools like appdynamics grafana newrelic etc
+ hands on experience using enterprise api management solutions i e apigee mulesoft
+ hands on experience on infra as code technologies i e cloudformation terraform etc
reasonable accommodation: sabre is committed to working with and providing reasonable accommodation to applicants with disabilities applicants applying for a sabre position with a disability who require a reasonable accommodation for any part of the application or hiring process may contact sabre's employee relations department at employee relations@sabre com
affirmative action: sabre is an equal employment opportunity affirmative action employer and is committed to providing equal employment opportunities to minorities females veterans and disabled individuals eeo is the law at http: www eeoc gov employers upload poster_screen_reader_optimized pdf
stay connected with sabre careers
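
a minimal sketch of the serverless item above: an aws lambda handler that reacts to a new object landing in s3 and queues it for downstream processing (the queue url is hypothetical; boto3 ships with the lambda python runtime):

# minimal lambda handler sketch: register each new s3 object for downstream work
import json

import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/ingest-queue"  # hypothetical

def handler(event, context):
    # s3 put events arrive as a list of records
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        sqs.send_message(
            QueueUrl=QUEUE_URL,
            MessageBody=json.dumps({"bucket": bucket, "key": key}),
        )
    return {"status": "queued"}
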
about the role: trulia is looking for an extraordinary data warehouse engineer who is passionate about all things innovative you should be comfortable fusing traditional dimensional modeling techniques with hadoop and cloud-based technologies to process massive volumes of data at scale you will design data models which meet the analytical and reporting needs of the company and you will craft etl to bring structure to the disparate data sources needed to power business decisions
about the team: we are looking for someone who is creative and one who isn't always satisfied with the status quo we'll provide the space and the environment - you'll show us what you can bring to the table we're a small team of six looking for one more unique individual in simple terms we service 30+ teams across the organization we work closely with analytics tracking data science product management and other data engineering teams to power insight and develop important data products for both our internal and external customers our primary data warehouse is hive and we utilize presto and tableau for ad hoc analysis and data exploration we explore new technologies often and are always looking for the best technologies for the job
who you are: do you have experience with these tech stacks?
hadoop - yarn hive presto tez orc parquet spark
linux - shell scripting is something you take pride in
sql and json - yes even window and lambda functions and nested datatypes don't scare you
pipeline management - what can you bring? there's lots here to explore let's talk
aws - packer terraform emr cost-management is the cloud *really* worth the expense? what do you think?
bs in computer science or equivalent and 5 years of data warehousing or big data processing experience
if any of the above piques your interest send us your resume and a cover letter and tell us why!
get to know us: trulia is a vibrant home-shopping marketplace focused on giving home buyers sellers and renters the information they need to make better decisions about where to live trulia is part of zillow group whose mission is to build the largest most trusted and vibrant home-related marketplace in the world zillow group is owned fueled and grown by innovators who help people make better smarter decisions around all things home we encourage one another at every level and our efforts are supported by employee-driven world-class benefits that enable us to enjoy our lives outside the office while building fulfilling careers that impact millions of individuals every day zillow group is an equal opportunity employer committed to fostering an inclusive innovative environment with the best employees therefore we provide employment opportunities without regard to age race ethnicity national origin religion disability sex gender identity or sexual orientation or any other protected status in accordance with applicable law

through the automation of critical marketing workflows built-in fraud protection and real-time delivery of actionable insights the platform drives revenue for global companies such as lenovo ticketmaster tommy hilfiger getty images shutterstock and advance auto parts impact radius has more than 275 employees across seven offices worldwide the primary function of the senior big data engineer is to perform planning maintenance and development of all aspects of the company's data pipeline (normally known as etl - extract transform load) this person is part of an agile scrum team and is expected to mentor lead others where applicable as well this role assumes a certain independent capability to investigate and learn new technologies subsequently providing recommendations to the team backed by a solid understanding of the value new technologies will provide this is a coding position and the senior engineer is expected to address their deficiencies contribute their learnings to the impact radius engineering organization and champion best practices
design implement write tests qa and deploy new applications
become intimately familiar with the cloudera technology (hadoop hdfs hbase impala oozie sqoop etc )
become intimately familiar with google cloud solutions
respond to alerts review error messages and
fix reported bugs quality issues in a timely manner
perform data quality analysis and introduce monitors and alerts to maintain it
assist users and report-writers in improving or debugging reports and dashboards
work closely with the director of bi data science
fulfill the department's quarterly objectives
assist the systems group with database and other infrastructure upgrades (sometimes off-hours weekends)
gain and maintain enough understanding of the business to deliver effective solutions
help ensure best practices and technologies are used in the department
requirements:
bs computer science or related field math or statistics
experience with analytic databases (infobright teradata redshift etc )
experience with reporting analytic tools (cognos microstrategy tableau pentaho jaspersoft actuate etc )
unix scripting (shell perl python)
experience working with large data volumes - terabytes to petabytes (preferred)
start-up or internet exposure valuable
strong interest in working with large data: extraction load transformation analysis and archival of it
motivated self-starter who likes working on very productive fast-paced teams
strong analytical and problem solving skills
proven ability to make independent decisions and be effective with minimal supervision
comfortable in a startup environment in which the job description is constantly evolving
must possess a sharp learning curve and pick up on concepts and technology quickly
work and communicate effectively within the core team and with internal customers stakeholders
positive can do attitude
key knowledge required:
advanced knowledge of scala spark hdfs impala cassandra hbase map reduce java
databases - normalization star schemas slowly changing dimensions indexing primary foreign key design
advanced knowledge of one or more - oracle mysql postgres
techniques to process large data volumes (see the sketch after this posting)
cloud technologies solutions
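
one standard technique behind the large-data-volumes bullet above is chunked streaming; a minimal pandas sketch that aggregates a file too big to load whole (file and column names are hypothetical):

# minimal chunked-processing sketch: memory stays flat regardless of file size
import pandas as pd

total_by_campaign = {}

# read 1 million rows at a time instead of loading the whole file
for chunk in pd.read_csv("clicks.csv", chunksize=1_000_000):
    grouped = chunk.groupby("campaign_id")["payout"].sum()
    for campaign, payout in grouped.items():
        total_by_campaign[campaign] = total_by_campaign.get(campaign, 0.0) + payout

print(sorted(total_by_campaign.items(), key=lambda kv: -kv[1])[:10])
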
what you’ll be doing: the verizon consumer market (vcm) vzw enterprise data warehouse (edw) marketing analytics team senior manager role will be responsible for the vision and leadership to a team of highly skilled data engineers to take verizon’s marketing and customer insights to the next level of industry leading technology upgrades and execution the position will manage challenging and time sensitive requests from multiple marketing retention and executive teams for business intelligence (bi) and analytic insights and delivery of ongoing initiatives to support our marketing initiatives this position will plan and proactively address these highly visible revenue and customer experience impacting initiatives in a timely manner while allocating the work modules appropriately mentor the team members and provide technical and business subject matter leadership interacting with enterprise architects the project management team and other it and business partners at onshore and offshore the position will provide oversight in areas of business innovation relating to building strong modern data science enablement support to centralize data hub capabilities and to ensure a consistent engagement with key business stakeholders this position will oversee a team utilizing the project lifecycle process in areas such as agile devops and waterfall methods and in specific technology areas of hadoop and cloud the position will have expanded departmental responsibilities leveraging the data management edw and big data expertise of the team to improve the customer experience and exploit new revenue-generating opportunities in joint affiliate customer and prospect marketing initiatives
+ provide oversight pertaining to architecture and design utilizing the business subject matter expertise for the following business areas: marketing applications for wireline and wireless campaign management loyalty and retention applications customer data integration to create a 360 view sales and revenue improvement churn reduction initiatives data mining and analytics presentations to the executive team and business partners
+ lead a team to optimize extract transform load and solution design for teradata and big data platforms like hadoop and cloud technologies for the following business areas: drive the right solution design leading team members onshore and offshore mentor and provide thought leadership optimum extract transform load solutions provide input on optimum platform choices
+ lead the team to refine requirements and user stories with end to end architects and business partners which includes leading the requirement definitions to provide maximum benefit and within feasibility define and allocate work modules to the team handle escalations
+ coordinate the development workflow and ensure optimum resource utilization in the following areas: requirement and architecture enterprise architecture team and provide impact estimates and make scheduling decisions tactical and strategic planning on platforms capacity and technology with internal and external teams represent the analytics data management team in joint affiliate marketing initiatives
+ ensure projects follow an appropriate agile devops approach for quality maintainability and security assign adequate staffing to projects and communicate status to clients and it management work closely with production support operations and customers to ensure production problems are resolved quickly and enterprise data warehouse service level agreements (slas) are met
+ motivate coach and train staff to maximize productivity and promote their professional development staff vacancies based on headcount and diversity targets set employee performance objectives and manage employee performance salary planning and administration develop competencies in agile devops methodologies cloud architecture and security-based designs
what we’re looking for you'll need to have:
+ bachelor’s degree or four or more years of work experience
+ six or more years of relevant work experience
even better if you have:
+ bs in computer science
+ 10+ total years of related experience
+ 5+ years of team lead or supervisory experience
+ 5+ years of data warehousing & bi
+ strong knowledge of project planning and management concepts methodologies tools standards and procedures
+ demonstrated technical and analytical skills
+ strong knowledge of data warehousing and bi strong business knowledge especially in marketing working knowledge of information systems concepts agile devops and cloud excellent interpersonal and leadership skills
+ pc and desktop applications competency (e g microsoft office microsoft project)
+ good conflict resolution and negotiation skills
+ strong decision-making and problem solving skills
+ strong organizational skills and ability to handle multiple tasks
when you join verizon you’ll be doing work that matters alongside other talented people transforming the way people businesses and things connect with each other beyond powering america’s fastest and most reliable network we’re leading the way in broadband cloud and security solutions internet of things and innovating in areas such as video entertainment of course
we will offer you great pay and benefits but we’re about more than that verizon is a place where you can craft your own path to greatness whether you think in code words pictures or numbers find your future at verizon
equal employment opportunity: we're proud to be an equal opportunity employer - and celebrate our employees' differences regardless of race color religion sex sexual orientation gender identity national origin age disability or veteran status different makes us better
reqnumber: 480078-1a

responsibilities:
deliver a data centric environment that enables the integration of data sources such that advanced analytic algorithms can be applied and yield analytic results supporting visualizations to inform senior level decision makers and line level analysts
perform security integration auditing and monitoring in accordance with client standards and processes to include accurate reporting procedures for internal use software expenditures
technical subject matter expertise in software coding and unit level testing including but not limited to java python big data hadoop apache spark hbase accumulo nosql mongodb
draw on existing data service architecture (dsa) architecture security and application requirements processes take advantage of lessons learned from previous projects leverage existing available client data repositories and services client software development environment devops processes eps data and platform capabilities dodiis available infrastructure (including but not limited to: c2s aws govcloud vmware physical systems) and leverage existing client and future environment tools and services
perform data analysis against the client's data holdings implementing industry best practices such as data mining predictive analytics text analytics or other techniques as appropriate
provide support for the "data retrograde" tool that provides a solution for sorting out relevant documents from a larger data set and operate maintain and update a machine learning data model rules engine and workflow within a web application platform that allows a user to curate documents against a binary decision (see the sketch after this posting)
provide sme expertise in cloud technologies aws c2s containers data layers micro-services system administration and nosql databases
qualifications:
must have an active current ts sci
a bachelor's degree or equivalent training
seven or more years of specialized experience working on complex data database projects as a data analyst data architect data scientist or database engineer
produces and reports timely and accurate analyses at the appropriate level of detail to enable business decisions
support the client with comprehending the context of its program data by extracting qualitative and quantitative relationships including patterns and trends from large amounts of data and providing analytic support to help inform policy rational decision-making and resource allocation
provide support for it strategic implementation coordination of project management and communications support efforts across a matrixed organization
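
the document-curation tool described above is at heart a binary text classifier; a minimal scikit-learn sketch of that shape (the training data is a toy stand-in not client data):

# minimal binary document classifier: tf-idf features plus logistic regression
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

docs = [
    "quarterly readiness report for forward units",
    "cafeteria menu for the first week of june",
    "logistics summary relevant to current operations",
    "parking lot repaving schedule",
]
labels = [1, 0, 1, 0]  # 1 = relevant, 0 = not relevant

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(docs, labels)

# a curator reviews the model's call on each new document (the binary decision)
print(model.predict(["updated readiness figures for deployed units"]))
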
technology doesn't change the world people do as a technology staffing firm we can't think of a more fitting mantra we're extreme believers in technology and the incredible things it can do but we know that behind every smart piece of software every powerful processor and every brilliant line of code is an even more brilliant person

leader among it staffing agencies: the intersection of technology and people is where we live backed by more than 65 years of experience robert half technology is a leader among it staffing agencies whether you're looking to hire experienced technology talent or find the best technology jobs we are your it expert to call we understand not only the art of matching people but also the science of technology we use a proprietary matching tool that helps our staffing professionals connect just the right person to just the right job and our network of industry connections and strategic partners remains unmatched apply for this job now or contact our branch office at 888 674 2094 to learn more about this position all applicants applying for u s job openings must be authorized to work in the united states all applicants applying for canadian job openings must be authorized to work in canada © 2018 robert half technology an equal opportunity employer m f disability veterans by clicking 'apply now' you are agreeing to robert half terms of use

req id: 00320-0010353666 functional role: network engineer country: usa state: ca city: los angeles postal code: 90048 compensation: $90 000 00 to $110 000 00 per year

requirements: data engineer salary: $90 000-$110 000 location: west la employment type: full time title: mid-level data science data engineer target salary: $90 000-$110 000 benefits: 401k health dental vision paid holidays & vacations

robert half technology is looking for a data engineer for a rapidly growing company this person will need to have demonstrated experience in spark hadoop kafka and other open-source technologies to drive the etl data roadmap for immediate and confidential consideration please email your resume to danielle abramov at danielle abramov@rht com

requirements:
2+ years working with big data
experience working with azure infrastructure
demonstrated ability in data modeling etl development and data warehousing
experience with kafka hadoop spark or other data processing tools
attention to detail and a passion for correctness
exposure and knowledge of security encryption and data governance

education experience: bs ms or phd in cs or equivalent real-world experience

the information and data architect role is concerned with updating modifying deploying and managing the alis system information and data architecture this role will focus on how information from the alis business aspect is transformed into the necessary system data that is to be defined stored consumed integrated and managed by different entities and it systems that alis interacts with as well as with any applications using or processing that information data the information and data architect (within an agile framework) will work closely with customers to identify critical business information needs and align those needs with the work being performed by the domain systems engineers interface leads and system designers the information and data architect will be responsible for a standard common business vocabulary that expresses strategic data requirements and outlines high level integrated designs to meet these requirements and align with enterprise strategy and the related business architecture integral to this role will be the development of domain object (dom) and data models (dm) leveraging model based system design (mbsd) techniques using sysml and dodaf updm principles
the modeling tool of choice on alis is enterprise architect (ea) the selected candidate will be expected to set alis information and data architecture principles create models of information and data that enable the implementation of the intended business architecture create diagrams showing key information and data entities and create an inventory of the data needed to implement the architectural vision of the alis product

primary responsibilities of the information and data architect include:
1 define and organize information and data at the macro (system) level
2 assist with the definition and organization of data at the micro level through data models and data dictionaries
3 provide a logical data model as a standard system source for consuming applications to inherit and draw from
4 provide a logical data model with elements and business rules needed for the creation of data quality (dq) rules
5 provide technical input and decisions as a primary member of the seit architecture and engineering (a&e) group within the alis engineering team

basic qualifications: bachelor's degree from an accredited college in a related discipline with 9 years of professional experience; or 7 years of professional experience with a related master's degree
- significant experience in software intensive systems engineering design requirements analysis and decomposition and a solid understanding of the systems engineering lifecycle
- experience defining decomposing and reviewing requirements for completeness and testability
- data modeling experience
- must have the ability to obtain a secret security clearance

desired skills: ms or bs in computer engineering computer science information systems or systems engineering combined with systems and or software engineering experience
- jsf alis experience
- familiarity with agile development
- systems modeling and design (sysml) and experience working with an enterprise-level toolset (e g enterprise architect)
- business and information modeling
- experience following an implementation through the full life cycle
- experience with svn & jira
- experience managing requirements in doors

as a leading technology innovation company lockheed martin's team of 113 000 people works with partners around the world to bring proven performance to our customers' toughest challenges lockheed martin has employees based in all 50 states and more than 570 facilities that span 70 countries join us at lockheed martin where we're engineering a better tomorrow lockheed martin is an equal opportunity affirmative action employer all qualified applicants will receive consideration for employment without regard to race color religion sex national origin age protected veteran status or disability status job location(s): orlando florida

why we're excited to get to work: the mission for payformance solutions is simple we aim to be a catalyst for payment transformation in the healthcare industry our proprietary software solutions allow payers and providers to focus on what really matters: providing patients with access to care that yields the best health outcomes at the lowest costs the healthcare industry is complex and fragmented payers and providers are faced with a lack of transparency and conflicting financial goals that fail to consider the health outcomes of patients payformance solutions offers data-driven turnkey software solutions that provide payers and providers with the technical tools and resources needed to design evaluate build measure and negotiate value-based reimbursement contracts as a neutral third party
our holistic solutions allow payers and providers to collaborate in an ecosystem that aligns financial goals with patient outcomes

what we are looking for: our team needs hands-on engineers who can produce beautiful & functional code to develop software solutions that solve the complex healthcare transformations facing payers and providers if you are an exceptional developer with an aptitude to learn and implement using new technologies and who loves to push the boundaries to solve complex business problems innovatively then we want to talk with you as a lead architect at payformance solutions you will have the opportunity to make mission critical software and web development contributions you'll have a voice in driving the direction and strategy of the company and work in an agile environment alongside a rapidly growing team of engineers data architects analysts and health policy experts in our chicago il office position is based in chicago

what you'll do:
architect and implement roadmaps and bring to life revolutionary new analytics and insights
provide technical direction to the engineering and application team
collaborate with cross-functional teams to utilize the new big data tools
design and develop code scripts and data pipelines that leverage structured and unstructured data
participate in requirements and design technical workshops with platform users
evaluate new big data technologies
provide architecture and technology leadership for aws
develop and recommend novel and innovative -- yet proven and demonstrated -- approaches to solving business and technical problems
responsible for development using java scala python and big data frameworks hdf storm spark s3 data pipeline
review analyze and evaluate market requirements business requirements and project briefs in order to design the most appropriate end-to-end technology solutions
integrate with external data sources and apis
use hadoop and related tools to manage the analysis of billions of healthcare claim transaction records
work with the developers business analysts and subject matter experts to understand the complex technological system in order to produce integrated end-to-end solution options

what you'll bring to the table:
outstanding verbal and written communication skills
excellent presentation and whiteboarding skills
bs (masters or phd preferred) degree in relevant disciplines: bioinformatics medical informatics healthcare administration statistics applied mathematics operations research optimization computer science computational theoretical physics data science or electrical computer engineering
six or more years of work experience
experience working with data data warehousing & bi
self-motivated enthusiastic and a quick learner you should have a broad base of experience and be interested in continuing to grow technically via hands on experience and learning
desire and ability to quickly learn new technologies on your own
knowledge of or willingness and aptitude to learn healthcare revenue cycle and claims data
must have experience with the aws data stack using s3 emr data pipeline

how we'll support you: in addition to the meaningful and challenging work payformance's dynamic work environment emphasizes integrity personal commitment and teamwork we offer an outstanding benefits program that includes:
a competitive annual salary
100% employer-paid medical option (employee spouse family)
100% employer-paid life insurance policy
100% employer-paid short-term and long-term disability insurance
401k retirement plan (5% employer contribution)
healthy work life balance… our flexible office hours and time-off allow you to make the most of your time in and out of the office to fit your needs plus an optional work-from-home day once a week (after 3 months)
paid maternity leave
tuition reimbursement
annual personal learning budget (books conferences etc )
thirsty thursday team building events
free gym access
awesome coworkers
work with a highly collaborative and values-driven team

not to boast but a little bit about us: payformance solutions is a health-tech company dedicated to advancing payment transformation in the healthcare industry we are a wholly owned subsidiary of altarum institute a nonprofit systems research and consulting organization that has been servicing government and private sector clients since 1946 altarum institute combines the analytical rigor of a research institution with the business agility of a consulting firm the institute is uniquely positioned to deliver practical systems-based health and healthcare solutions to its clients altarum's nonprofit status ensures that the public interest is always preeminent in our work our dedication to social responsibility is evident in all that we do serving the public good with integrity and enabling others to do the same altarum develops and promotes best practices in the application of information technology to health and health care applying systems research principles and analytic objectivity we work to increase access to health information; improve the organization and usability of health information; and develop new knowledge from health information we work to achieve these goals by addressing all aspects of information technology: policy and planning efforts system design and development information exchange and the management and evaluation of specific information technology and strategies at payformance we don't just accept difference - we celebrate support and thrive on it for the benefit of our employees our clients and our community payformance solutions is proud to be an equal opportunity employer all qualified applicants will receive consideration for employment without regard to among other things race color religion sex sexual orientation gender identity national origin age marital status status as a protected veteran or disability to all recruitment agencies: payformance does not accept agency resumes we are not responsible for any fees related to unsolicited resumes

at pandora we're a unique collection of engineers musicians designers marketers and world-class sellers with a common goal: to enrich lives by delivering effortless personalized music enjoyment and discovery people (the listeners the artists and our employees) are at the center of our mission and everything we do actually employees at pandora are a lot like the service itself: bright eclectic and innovative collaboration is the foundation of our workforce and we're looking for smart individuals who are self-motivated and passionate to join us be a part of the engine that creates the soundtrack to life discover your future at pandora the position of senior data engineer analytics is a key member of the subscription and billing team bridging the analytics of pandora subscription data with the verification and support of the product development and engineering organizations data analytics is a core function at pandora and we are really looking forward to bringing on someone who loves to dig into the data and find new ways of looking at a problem
this is a strategic role that will influence the growing data and analytics that enhance our understanding of the pandora user base the role is a perfect fit for someone who has a passion for analytics and data plus a solid background in software development you must be capable of managing your time well and working collaboratively excellent communication skills both written and verbal are required

responsibilities:
you will create quality assurance testing strategies and sample data and execute against them helping to provide secure and accurate data pipelines throughout many of our systems
analyze data to confirm relationships and identify potential data quality issues
design monitor and maintain reports kpis and quality trends for various components and system data
perform root cause analysis on identified data issues helping to improve our overall data systems
develop and utilize testing software that will contribute to building flexible and scalable data testing automation solutions
identify and communicate risks with respect to upcoming releases and projects
verify accuracy of data sets and ensure the data is delivered in the correct layout format schema
contribute to product requirement meetings helping to influence the design and development of new features and projects from a data and quality perspective
serve as the liaison between the engineering team and key data stakeholders including product analytics marketing analytics and the subscription business teams

qualifications:
5 years development experience and 2-3 years of data or analytics engineering experience working with big data technologies a la the hadoop environment
2 years of test automation experience - specifically testing data pipelines data quality and apis
highly detail oriented with strong analytical skills
ability to read and write complex sql queries to analyze data investigate systemic defects and validate data quality
development of high performance sql hive queries and data transformation jobs
strong relational database skills - design scalability data analysis mapping and modeling
ba bs or better in computer science or a related field such as mathematics data science economics machine learning

pluses:
familiarity with hadoop mapreduce hive sql json spark teradata or oracle data technologies
unix linux experience and scripting skills (shell perl python etc )
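much of the pipeline-verification work listed above reduces to simple assertions between a source data set and its downstream copy (row-count parity, null-rate thresholds, schema checks); a minimal pandas sketch, with hypothetical tables and a hypothetical threshold:

    # minimal sketch of a data-quality gate between a source and target table
    # (hypothetical frames and threshold -- illustrative, not pandora's checks)
    import pandas as pd

    source = pd.DataFrame({"user_id": [1, 2, 3], "plan": ["trial", "paid", "paid"]})
    target = pd.DataFrame({"user_id": [1, 2, 3], "plan": ["trial", "paid", None]})

    def check(source, target):
        issues = []
        if len(source) != len(target):  # row-count parity between pipeline hops
            issues.append(f"row count mismatch: {len(source)} vs {len(target)}")
        null_rate = target["plan"].isna().mean()
        if null_rate > 0.0:             # threshold would come from the kpi spec
            issues.append(f"plan null rate {null_rate:.1%} exceeds threshold")
        return issues

    for problem in check(source, target):
        print("data quality:", problem)  # in practice, feed monitoring/alerting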
pandora is committed to diversity in its workforce pandora is an equal employment opportunity employer and considers qualified applicants without regard to gender sexual orientation gender identity race veteran or disability status women and people of color are encouraged to apply pandora is also a vevraa federal contractor pandora requests priority referrals of protected veterans from each esds as required by regulation if you believe you need a reasonable accommodation in order to search for a job opening or to apply for a position please contact us by sending an email to disability@pandora com this email box is designed to assist job seekers who require a reasonable accommodation to the application process a response to your request may take up to two business days in your email please include the following:
- the specific accommodation requested to complete the employment application
- the location or office to which you would like to apply
- the subject of the email should read "request for reasonable accommodation"

data piper is a premier cloud solution data analytics data management and it consulting firm we work with organizations to develop and implement road maps for all aspects of it to improve business efficiency along with maintaining commitment to set the highest standards in service efficiency solutions are tailored to client needs by following innovative and proven methodologies corporate ethics strategic planning with utmost commitment to customer satisfaction whatever your it requirement is data piper infotech offers a world of possibilities and solutions

what you'll do:
build big data pipelines and structures that power predictive models and intelligence services on large-scale datasets with high uptime and production level quality
implement and manage large scale etl jobs on hadoop spark clusters in amazon aws microsoft azure
interface with internal data science engineering and data consumer teams to understand the data needs
own data quality throughout all stages of acquisition and processing including data collection etl wrangling ground truth generation and normalization

what you need to succeed:
master's or bachelor's with equivalent experience in computer science or equivalent technical fields
2+ years of experience working with large data sets using open source technologies such as spark hadoop kafka on one of the major cloud vendors such as aws azure and google cloud
strong sql (postgres hive mysql etc) and no-sql (mongodb hbase etc ) skills including writing complex queries and performance tuning
must have good command of python spark scala and big data techniques (hive pig mapreduce hadoop streaming kafka)
excellent communication relationship skills and a strong team player

if interested in this opportunity then click apply or reach out to me directly for more details mritunjay@datapipertech com 978-216-9214 thanks mritunjay (mj) senior technical recruiter m: 978-216-9214 609-325-3641 data piper infotech - provided by dice big data hadoop hdfs mapreduce hive pig spark kafka cloud aws azure hbase mongodb etl

big data developer
job id: 11560 company: internal postings location: woodcliff nj type: contract duration: 6 months rate: dep on exp salary: open status: active openings: 1 posted: 3 weeks ago
job seekers please send resumes to resumes@hireitpeople com or call: (202) 719-0200 ext: 207

description:
+ the market intelligence goal is to complete data ingestions and data preparations to enable a consistent and innovative services based model supporting data lake smart analytics visualizations and predictive and prescriptive concepts to reality
+ import and export data between an external rdbms and clusters
+ proven understanding with hadoop hive spark hdfs and scala
+ knowledge in hadoop ecosystem
+ good knowledge of database structures theories principles and practices
+ hands on experience in data loading tools like flume sqoop nifi
+ hands on experience in hiveql
+ knowledge of workflow schedulers like oozie
+ knowledge of talend for data extraction data transformation and data loading
+ writing high-performance reliable and maintainable code
+ analytical and problem solving skills applied to the bigdata domain
+ sharp analytical abilities and proven design skills
+ knowledge or experience in building data integrations understanding enterprise integration patterns and concepts

day to day job duties: (what this person will do on a daily weekly basis)
+ import and export data between an external rdbms and hadoop clusters (see the sketch after this list)
+ develop ingestion and data preparation scripts
+ design data ingestion and data loading framework to allow smooth and error free data migration for initial and incremental loading
+ define data mapping for source to target
+ define workflow and schedules using schedulers like oozie
+ develop scripts using talend for data extraction data transformation and data loading
+ writing high-performance reliable and maintainable code
+ understand and design the target data structure to build data integrations utilizing enterprise integration patterns and concepts
+ collaborate with data scientists to design a target data model that will help in predictive analytics and dashboards
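the rdbms-to-cluster import duty flagged above is usually done with sqoop or nifi; as a rough stand-in, a pyspark sketch of the same initial-load step using spark's jdbc reader (the connection url, table name, and output path are hypothetical):

    # minimal sketch: pull a table from an external rdbms into the cluster as parquet
    # (hypothetical jdbc url/table -- stands in for a sqoop/nifi ingestion job)
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("rdbms_ingest").getOrCreate()

    orders = (
        spark.read.format("jdbc")
        .option("url", "jdbc:mysql://source-db:3306/sales")   # hypothetical source
        .option("dbtable", "orders")
        .option("user", "etl_user")
        .option("password", "***")
        .load()
    )

    # initial load; an incremental load would filter on a watermark column instead
    orders.write.mode("overwrite").partitionBy("order_date").parquet("/data/lake/orders")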
basic qualifications: (what are the skills required for this job with minimum years of experience on each)
+ minimum 4 years of experience in big data
+ proven understanding with hadoop hive spark hdfs and scala
+ knowledge in hadoop ecosystem
+ good knowledge of database structures theories principles and practices
+ hands on experience in data loading tools like flume sqoop nifi
+ hands on experience in hiveql
+ knowledge of workflow schedulers like oozie
+ knowledge of talend for data extraction data transformation and data loading
+ analytical and problem solving skills applied to the bigdata domain
+ sharp analytical abilities and proven design skills
+ minimum 5 years of experience in a global team environment and working with an offshore team
+ minimum 3+ years of experience conceptualizing and architecting the target environment for big data

travel: no travel person is required to be in the nj office 5 days a week degree: bachelors in computer science or equivalent work experience
nice to have: (but not a must)
+ project lead and architecture experience

infoblox is seeking a big data engineer who is eager to join an agile and highly collaborative team you will join a team that builds real-time streaming analytic solutions that take networking data and provide insights to our customers you will be responsible for horizontal scale as well as developing new insights into that data this engineer will be building a real-time analytics framework that ingests and analyzes billions of records per day working within a resilient infrastructure with no single point of failure this is an exciting opportunity to work with a team of expert data scientists and engineers

responsibilities:
design develop maintain and test big data solutions
build large-scale data processing systems using cloud computing technologies
complex big data applications with a focus on collecting parsing managing and analyzing large sets of data to turn information into insights
self-starter able to learn new technologies and systems on your own

requirements:
5+ years experience in software development using java scala golang and or python
2+ years experience in big data
ability to architect highly scalable distributed systems using different open source technologies
experience building high-performance algorithms
experience with spark mapreduce zookeeper kafka cassandra
strong technical foundation in information systems and databases
full software development life-cycle experience with proven track record of shipping products
results oriented solid work ethic with excellent attention to detail and quality
this is a hands-on position must be able to write code individually or as part of a team
excellent verbal and written communication skills
experience working in an agile development environment

preferred experience: experience with impala athena elasticsearch dynamodb and or comparable technologies
education: bachelor's in cs ce or ee is required masters in cs ce or ee is preferred
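to make the ingest-and-analyze loop above concrete, a spark structured streaming sketch that consumes a kafka feed and keeps running counts per key; the topic and broker names are hypothetical, and a real job would parse the payload and write to a durable sink rather than the console:

    # minimal sketch: streaming aggregation over a kafka record feed
    # (hypothetical topic/brokers -- illustrative of a real-time ingest loop)
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("network_event_stream").getOrCreate()

    events = (
        spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker1:9092")
        .option("subscribe", "network-events")
        .load()
    )

    # records arrive as raw bytes; count per key as a stand-in for real parsing
    counts = events.groupBy(F.col("key").cast("string")).count()

    query = (
        counts.writeStream.outputMode("complete")
        .format("console")               # a production job would sink to a store
        .start()
    )
    query.awaitTermination()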
it's an exciting time to be at infoblox we are the market leader in technology for network control our success depends on bright energetic talented people who share a passion for excellence in building the next generation of networking technologies - and having fun along the way infoblox offers a fast-paced action-oriented environment we promote a culture that embraces innovation change teamwork and strong partnerships join the winning infoblox team - our future looks bright and so will yours

data architect modeler reston va + 50% travel (denver co calgary canada) us citizenship required

required qualifications:
10+ years of consulting and software development experience including 3+ years in a combination of relevant big data analytics areas
proven ability to leverage and integrate technology stacks into an overall solution architecture in support of business requirements and use cases
experience with a variety of platforms languages frameworks modeling and methodologies
knowledge of data architecture data modeling advanced analytics or data science implementations
experience presenting technical concepts and solutions (including whitepapers) to business and technology stakeholders

technology skills:
relational databases such as oracle
experience with sql
extract transform load (etl) tools such as pentaho or informatica
reporting tools or platforms like cognos microstrategy tableau
log analysis tools such as splunk or the elasticsearch logstash kibana (elk) stack
one or more distributions of hadoop such as cloudera hortonworks mapr
aws platform including ec2 s3 and redshift
experience with at least one procedural language: python r scala or java

your tasks and responsibilities: the primary responsibilities of this role senior data scientist data engineer - biologics are to:
+ work closely with research & development information technology (r&d it) and other global partners
+ design and extend new and existing data repositories including the design of a data lake
+ automate data extraction transport transformation and preprocessing pipelines and processes
+ manage and code ad hoc queries to enable data analytics
+ collaborate with internal and external colleagues and vendors to improve the operation of the data science team's infrastructure
+ participate in problem solving with team members from other functional teams and sites
+ communicate effectively through listening documentation and presentation especially using compelling visualization tools

who you are: your success will be driven by your demonstration of our life values more specifically related to this position bayer seeks an incumbent who possesses the following:

required qualifications:
+ phd in data science computer science statistics or related field with minimum of 4 years relevant experience or
+ ms in data science computer science statistics or related field with minimum of 8 years relevant experience or
+ bs in data science computer science statistics or related field with minimum of 10 years relevant experience
+ proficiency in python sql and or r is required prior experience with extract transform and load (etl) pipelines is a must
+ experience leading a small team in software project management machine learning experimental design and statistical design are desirable
+ excellent communication skills with scientists from diverse backgrounds ability to work independently and as a part of a project team ability to prioritize tasks and work on a deadline

preferred qualifications:
+ 3+ years of experience in industry
+ familiarity and exposure to cloud platform administration
+ experience with nosql databases
domestic relocation assistance is offered for this position

your application: bayer offers a wide variety of competitive compensation and benefits programs if you meet the requirements of this unique opportunity and you have the "passion to innovate" and the "power to change" we encourage you to apply now job postings will remain open for a minimum of ten business days and are subject to immediate closure thereafter without additional notice to all recruitment agencies: bayer does not accept unsolicited third party resumes bayer is an equal opportunity employer disabled veterans bayer is committed to providing access and reasonable accommodations in its application process for individuals with disabilities and encourages applicants with disabilities to request any needed accommodation(s) using the contact information below country: united states location: ca-west sacramento

greetings hope you are doing great we at xoriant are looking for a senior data engineer @ sunnyvale ca position with our direct client if you are interested in the jd please revert with an updated resume asap

senior data engineer sunnyvale ca 6+ months need locals (f2f is mandatory)

description: position summary: very strong engineering skills should have an analytical approach and have good programming skills provide business insights while leveraging internal tools and systems databases and industry data
minimum of 5+ years' experience experience in retail business will be a plus
excellent written and verbal communication skills for varied audiences on engineering subject matter
ability to document requirements data lineage subject matter in both business and technical terminology
guide and learn from other team members
demonstrated ability to transform business requirements to code specific analytical reports and tools
this role will involve coding analytical modeling root cause analysis investigation debugging testing and collaboration with the business partners product managers and other engineering teams
must have strong analytical background self-starter must be able to reach out to others and thrive in a fast-paced environment
strong background in transforming big data into business insights

technical requirements:
knowledge experience of teradata physical design and implementation
teradata sql performance optimization
experience with teradata tools and utilities (fastload multiload bteq fastexport)
advanced sql (preferably teradata)
experience working with large data sets
experience working with distributed computing (mapreduce hadoop hive pig apache spark etc )
strong hadoop scripting skills to process petabytes of data
experience in unix linux shell scripting or similar programming scripting knowledge
experience in etl processes
real time data ingestion (kafka)

nice to have:
development experience with java scala flume python
cassandra
automic scheduler
r r studio sas experience a plus
presto
hbase
tableau or similar reporting dash boarding tool
modeling and data science background
retail industry background

education: bs degree in specific technical fields like computer science math statistics preferred

thanks & regards ravalika pallikonda recruiter desk: +1 408-484-3157 | ravalika pallikonda@xoriant com - provided by dice teradata physical design and implementation teradata sql performance optimization; fastload multiload bteq fastexport; mapreduce hadoop hive pig apache spark; etl

the mdm analyst and developer will maintain support and enhance the master data management component of an existing enterprise architecture (ea) platform the scope of services for the mdm analyst and developer includes the following:
provide configuration problem resolution programming analytical and support services for the mdm component of the ea project
provide ongoing application and technical support as related to this component to ensure business continuity
serve as the component lead for the master data management (mdm) component of the ea project
work collaboratively with other component leads and personnel in other disciplines to implement new technologies necessary for the ea project
work closely with the project management office personnel to ensure adherence to project standards
provide onboarding estimates and assistance for new projects that plan to utilize the mdm component
work in a highly cooperative collaborative environment with individuals at all levels both functional and technical
provide excellent communication skills
show initiative and be proactive in anticipating needs

experience and expertise in the following areas are strongly preferred:
5+ years of experience with enterprise java development
project experience integrating with and or configuring a master data management tool
demonstrate an understanding of master data management domain concepts including golden record management reference data management and data stewardship

experience and expertise in the following areas are preferred:
experience installing and administering ibi omnigen

definitive logic is a community of experts and we seek to cultivate the highest level of talent with the best education and professional experience available our team is highly motivated to deliver the best innovation to our clients and partners we're looking for a senior engineer scientist analyst to work in an engineering environment focusing on engineering and or scientific studies and analysis providing technical solutions we're looking for someone to support programs with exceptional creativity and resourcefulness in the most demanding and complex assignments you will perform analyses and develop recommendations that may impact acquisition programs and activities you will also provide technical direction and perform complex analyses and you will provide design implementation and testing services for complex information systems

requirements: bachelor's degree in system engineering computer science or other related field at least 10 years of experience

desired skills and experience:
at least five (5) years of information systems management experience sw skills: html visual basic java jsp java struts javascript java servlet pl sql cvs eclipse apache weblogic iis oracle obiee oracle data visualization d3 ms access sql agile software development software development life cycle
at least five (5) years of data warehousing experience including familiarity with data modeling data warehousing agile software development software development life cycle concepts and implementing hyperion interactive reporting and obiee+ solutions
at least five (5) years of experience in architecting bi or web application system solutions
at least five (5) years of software development or cots solution deployment project management experience
at least five (5) years of experience with the ppbe process data structure (program budget acquisition) with bi solutions and related products and systems
dod 8570 01-m iat level ii certification
applicants selected will be subject to a security investigation and must meet eligibility requirements for access to classified information

about definitive logic:
definitive logic is a growing small business with a focus on providing technical solutions to public sector and commercial organizations we invest in developing our team members into technical leaders in their fields of interest if you are looking to be a member of a talented company with a small business culture we look forward to hearing from you

this is an opportunity to architect scale and maintain world-class analytics and optimization infrastructure that supports multiple lp digital businesses this is a full-time position out of our delray beach office a remote option may be available for the right candidate you must be able to leverage big data and data mining technologies to solve a diverse set of problems this is not "just" an engineering role this is an entrepreneurial position that requires a combined passion for working with high volume data sets and understanding the marketing product aspects of lp brands to help grow the business as a whole you must have at least 5 years of hands-on experience creating and maintaining etl processes at scale building real-time dashboards and managing data warehouse solutions strong proficiency in python and sql required

who we are: launch potato is a profitable startup studio that incubates and launches mobile and web companies on our proprietary technology stack we're headquartered in delray beach fl but have an amazing distributed global team we believe in building teams who can solve complex problems using great engineering smart marketing data science and fun!

the role - tl;dr: launch potato collects millions of data points each day across several consumer-focused digital businesses you'll be responsible for implementing big data and data mining technologies that centralize this information so marketing can run real-time reporting you'll design and develop high-performance fault-tolerant and scalable distributed systems you'll optimize existing etl pipelines and create new ones that effectively manage our large data sets while learning the ins and outs of the businesses in which we are tracking user behavior you'll centralize data in our data warehouse create reports through data visualization tools like looker and proactively find new ways to improve the stability and speed of data pipelines still interested?
read on - our entire business is driven by analytics you'll interact with marketing social and customer service teams to understand reporting requirements then bring data from various silos together into one effectively managed data warehouse by optimizing data models you'll enrich the team's understanding of how we acquire users how users engage and how we can most quickly identify new opportunities to grow our businesses you'll work with a variety of technologies on a day-to-day basis including redshift postgresql mysql mongodb redis elastic mapreduce (aws) elasticsearch and hadoop hive pig projects will include managing data warehouse solutions working with engineering to consume tracking data and developing a flexible data model for unstructured data you'll add new features for more flexible management of jobs and optimize dependency structures you'll stay updated on industry best practices and advocate for data quality

the perks
• lead data engineering for multiple high-traffic growing digital businesses
• amazing location in vibrant downtown delray beach florida
• energetic fun startup culture
• work directly with founding team 4 young internet execs who successfully developed and scaled double-digit million dollar online businesses from the ground up in the past decade
• international exposure working with an amazing talented team distributed across the world
• unlimited career growth and profit-sharing opportunities
benefits include: health insurance dental insurance profit-sharing 401k fully stocked kitchen with snacks and drinks free lunch fridays flexible vacation policy two annual company retreats with global team a kickass working environment!

what you'll do
• build and maintain various data stores
• own the end-to-end data processing pipeline components in virtual and cloud-based environments for a variety of digital businesses
• develop and maintain cross-platform etl processes
• maintain and enhance scalable web-based visualization tools (looker)
• routinely audit data loading processes and monitor data anomalies
• manage resource utilization optimizing instance sizing storage capacity and performance
• develop and maintain third party api processes; work with third parties to get what we need
• work with marketing team to qa all partners' incoming data
• optimize data modeling based on usage patterns
• document structure and processes never assuming certain intricacies aren't important
• perform routine maintenance on production database systems
• work with engineering team members to review ddl perform production database modifications write stored procedures and help optimize queries
• enhance our existing database analysis and monitoring infrastructure
• act as internal primary point of contact for all reporting requests
• work closely with cross-department teams to keep data flowing and ensure the proper tracking and tools are used for new development
• be responsible for reporting costs for data storage; work to keep costs as low as possible
• perform ad hoc queries and data analysis requests as needed
• grow and develop a team of data engineers

who we're looking for
• minimum: at least 5 years of hands-on experience in etl engineering building distributed systems at scale
• strong python programming skills and experience
• strong sql skills and experience
• passionate about database technologies
• experience using a data visualization tool like looker
• experience developing real-time dashboards
• understands data normalization and how usage of such things can impact performance positively and negatively
• strong linux skills and experience
• very familiar with aws
• works well with others
• understands basic marketing concepts
• cs degree or equivalent experience
• is open to new technology choices but tempers such choices responsibly
• extremely detail oriented
• extremely resourceful and enjoys overcoming challenges
• able to prioritize initiatives based on impact to business
• thrives in a fast-paced environment and can balance multiple tasks simultaneously in a startup atmosphere

bonus qualifications
• snowplow
• machine-learning
• statistical modeling
• java programming skills
• integrating reporting with slack
• experience integrating reporting with advertising platform data such as adwords facebook

availability: our normal team work hours are roughly 10am-5pm et m-f however as a digital business you'll be expected to monitor throughout the week and on weekends as needed additionally we are a distributed digital team and there may be late early meetings due to different time zones we always try to work with everyone's schedules but at times there may be meetings earlier in the morning and later at night

want to be part of a successful high-growth company? join the lp team! send the information below to bigdata@launchpotato com with a subject line that includes "{your name} senior software engineer" and:
1 a brief cover letter that explains why you're qualified for this role and why working for launch potato appeals to you
2 what do you see as the biggest challenge for you in this role?
* applications without the information above will not be considered

sense360 is the leader in real-time 360° insights which are revolutionizing the current state of market research that's plagued by high-cost static solutions that are neither timely nor accurate through analyzing and combining massive datasets of digital behavioral data with attitudinal surveys sense360 creates customized industry solutions that enable continuous measurement and optimization of its clients' business strategies and tactics sense360 was founded by successful repeat entrepreneurs and is funded by investors of pinterest uber hoteltonight riotgames and twilio sense360 has identified a massive market opportunity and we are looking for a data engineer to help take us to the next level you will have a large and immediate impact on the company as you will be the 5th engineer on the team we are looking for a scrappy engineer with solid cs fundamentals who focuses on delivering value quickly as a data engineer you will be responsible for architecting our data infrastructure that processes 2+ tb of sensor data per day in an efficient manner you will be working closely with our data science team to scale their processing algorithms you will be working heavily with apache spark aws python and apache airflow

qualifications:
7+ years experience
advanced knowledge of a distributed processing technology (e g spark storm presto hadoop samza flink etc)
bias for delivering value over beautiful solutions
strong desire to work in a startup
passionate about data
solid cs and testing fundamentals

our challenges: our challenges revolve around scaling data and automation (an orchestration sketch follows this list)
- architect our data pipeline which handles 2+ tb of sensor data per day from 2mm+ users using aws emr s3 lambda and redshift
- build platforms for our data team to allow them to iterate build and deploy their models and reports
- architect our system to become a real-time data system
- scale and optimize our spark data processing clusters to be able to handle 5mm+ users
- research analyze and integrate new 3rd party datasets into our dataset
- research new data technologies to take us to the next level of scale and functionality
- develop best practices for data ingestion data cleaning and data processing
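given the spark-plus-airflow stack named above, the first challenge in the list might be orchestrated along these lines; a minimal airflow sketch where the dag id, schedule, and task bodies are hypothetical placeholders (import paths assume airflow 2.x):

    # minimal sketch: daily orchestration of an ingest -> process -> load pipeline
    # (hypothetical dag/tasks -- illustrative of an airflow-managed sensor pipeline)
    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def ingest():
        print("pull raw sensor files from s3")       # placeholder task body

    def process():
        print("submit spark job to clean/aggregate") # placeholder task body

    def load():
        print("copy aggregates into redshift")       # placeholder task body

    with DAG(
        dag_id="sensor_pipeline",
        start_date=datetime(2018, 1, 1),
        schedule_interval="@daily",      # one run per day of sensor data
        catchup=False,
    ) as dag:
        t1 = PythonOperator(task_id="ingest", python_callable=ingest)
        t2 = PythonOperator(task_id="process", python_callable=process)
        t3 = PythonOperator(task_id="load", python_callable=load)
        t1 >> t2 >> t3                   # linear dependency chain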
our core values:
1 we are one team - we are one team with a single goal - to build an amazing company if the company wins we all win we build the team by looking for amazing teammates who will elevate the team over individual genius
2 always improving - we believe that the best people never stop learning and growing and that the only way to do that is to be humble and eager to be better we crave feedback we aren't defensive and we internalize it we are also equally as comfortable giving constructive and helpful feedback
3 we are all owners - we empower everyone at the company to own their work owners want lots of responsibility have authority and freedom to make real decisions surface bad news quickly and ask for help when they need it a company of owners also means that ideas come from everywhere that we all learn from each other and titles and seniority don't matter
4 scrappiness - we are trying to do something transformative while also taking on the biggest and most resourced companies in the world the only way we can pull off the impossible is to simplify everything to its core only spend time and money on what truly matters and work very very very hard
5 we have fun - building a company is hard work but it also needs to be incredibly fun and rewarding we don't just work together; we also make the time to have fun this includes team outings dinners ping pong halo and clipper games

willing to relocate or sponsor visas

88158br job title: data architect l48 segment: upstream

role synopsis: the data architect is responsible for designing executing and maintaining the data architecture for bp l48 the data architect must be an expert in sql development further providing support to the data and analytics team in database reviews data flow and data element analysis whereas data engineers can understand data interactions in source systems the data architect can apply their knowledge to how data systems interact with each other the position of the data architect is the technical lead in the mastering of data (mdm) to enable advanced analytics and data processing this person must also be comfortable operating as an individual contributor and using influence and expertise to aid the transformation of an organization

req id: 88158br location: united states - colorado - denver is this a part time position?: no relocation available: yes - domestic (in country) only travel required: yes - up to 25%

key accountabilities:
+ define implement and evolve business rules in bp l48
+ act as company subject matter expert and internal consultant on evolving legacy systems into sustainable solutions through replacement or architecture
+ analyze structural requirements for new software and applications and ensure they meet company mdm requirements
+ ensure all systems comply with company data protection security integrity and cyber security requirements
+ define archive back up retention and deletion procedures for company data
+ plan and oversee execution of integration of disparate source systems
+ coordinate with the data science group to identify future needs and requirements
+ be knowledgeable of industry trends and best practices advising senior management on new and improved data & analytics strategies that will drive departmental performance leading to overall improvement in data governance across the business promoting informed decision-making and ultimately improving overall business performance
+ design l48 data systems and planned data disposal systems (archiving and deletion)
+ design end to end data architecture and data flows
+ ability to design and develop databases data warehouses and multidimensional databases
+ determine database structural requirements by analyzing client operations applications and programming; reviewing objectives with customers and data engineers; evaluating current systems
+ design the implementation of database systems by developing flowcharts; applying optimum access techniques; coordinating installation actions; documenting actions
+ in conjunction with data engineers maintain database performance by identifying and resolving production and application development problems; calculating optimum values for parameters; evaluating integrating and installing new releases; completing maintenance; answering user questions
+ update job knowledge by participating in educational opportunities; reading professional publications; maintaining personal networks; participating in professional organizations
+ accomplish information systems and organization mission by completing related results as needed

leadership:
+ generates enthusiasm among team members
+ proactively seeks opportunities to serve in leadership roles
+ challenges others to develop as leaders while serving as a role model
+ manages the process of innovative change
+ represents team excellence and bp values when interacting with internal customers
+ collaborates with and influences others not in direct line

teamwork:
+ facilitates effective team interaction
+ acknowledges and appreciates each team member's contributions
+ works effectively with distributed team members

essential education:
+ bs in computer science mathematics or equivalent degree ms preferred

desirable criteria and qualifications:
+ experience with research methods statistical or data analysis such as spss sas
+ data manipulation software - mds dq
+ experience with esb soa or eventing environments
+ supervision management or leadership position experience desirable
+ desire to continually learn outside of a classroom environment and successfully apply learnings
+ demonstrated willingness to both teach others and learn new techniques

about bp: bp's us lower 48 (l48) onshore business operates across a vast us geography from texas north through the rocky mountains the business manages a diverse portfolio which includes an extensive unconventional resource base of about 7 5 billion barrels of oil equivalent across 5 5 million gross acres in some of the largest and most well-known basins in the us headquartered in houston (texas) l48 employs about 1 700 people across six states operates more than 9 600 producing wells and has 70 000 royalty owners our vision is to be the most respected and admired oil and gas company in the lower 48 us states our wyoming operations are anchored on the giant wamsutter tight gas field in the south central part of the state in the san juan area of colorado and new mexico we produce from tight gas sands and operate the largest coal-bed methane field in the us our mid-continent operations cover the prolific anadarko basin and are home to the famed east texas basin along with the woodford shale gas play and arkoma basin we also have non-operating interests in over 10 000 wells across the us with substantial positions in both the eagle ford and fayetteville shale basins
in 2015 the l48 onshore was established as a separate business within bp's upstream to manage its onshore oil and gas assets across the us this effort is being undertaken to improve competitiveness and help l48 remain at the forefront of innovation and development of technologies for unconventional resources

application close date: 13-apr-2018 sub-category: business analysis & consulting job category: information technology & services countries (state region): united states - colorado

disclaimer: if you are selected for a position in the united states your employment will be contingent upon submission to and successful completion of a post-offer pre-placement drug test (and alcohol screening medical examination if required by the role) as well as pre-placement verification of the information and qualifications provided during the selection process the drug screen requires a hair test for which bp must be able to obtain a sufficient hair sample for analysis (~4 cm 1 ½" scalp or 2 cm ¾" body - arms & armpits legs chest) as part of our dedication to the diversity of our workforce bp is committed to equal employment opportunity applicants will receive consideration for employment without regard for race color gender religion national origin disability veteran status military status age marital status sexual orientation gender identity genetic information or any other protected group status we are also committed to providing reasonable accommodations for qualified individuals with disabilities and disabled veterans in our job application procedures if you need assistance or an accommodation due to a disability you may contact us or have one of your representatives contact us at bpusapplicationassis@bp com or by telephone at 281 366 1999 read the equal employment opportunity is the law poster and the poster supplement for more information about equal employment opportunities ( spanish version ) bp is an equal employment opportunity and affirmative action employer view our policy statement

essential experience and job requirements:
+ 15+ years of relevant work experience in it data & analytics with 10+ years of data engineering and or data architect experience
+ experience in any big data technologies - hadoop emr amazon redshift azure cosmosdb azure data lake aws dynamodb or advanced analytics tools
+ software development lifecycle knowledge
+ stream processing services such as kafka aws kinesis apache storm spark streaming azure event hub etc (a consumer-side sketch follows this list)
+ demonstrated experience working in large-scale data environments which included real-time and batch processing requirements
+ knowledge of tqm qa practices
+ strong understanding of etl processing with large data stores
+ strong data modeling skills (relational dimensional and flattened)
+ strong analytical and sql skills with attention to detail
+ validated experience with 1 or more non-sql languages like python or java
+ experience working in a hybrid environment with multiple datacenters multiple public cloud and saas providers
+ knowledge of and experience implementing complex applications e g service oriented architectures or distributed graphics processing engines
+ ability to work with multiple external teams and accomplish shared goals through building consensus
+ conceptual skills
+ decision making
+ informing others
+ strong communication (written verbal) and collaboration skills
+ consulting negotiation and relationship skills
+ problem solving skills
+ enthusiastic high-energy individual self-motivated people-oriented and self-directed
+ must be an intelligent articulate and persuasive leader who can serve as an effective member of the data & analytics team and who can communicate concepts to technical & nontechnical colleagues
+ must be able to maintain focus on achieving results whilst being patient and pragmatic
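for the stream-processing item above, a minimal consumer-side sketch with the kafka-python client; the topic, brokers, group id, and field names are all hypothetical illustrations:

    # minimal sketch: consume a stream of json events from kafka
    # (hypothetical topic/brokers -- illustrative of a stream-processing service)
    import json
    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "well-telemetry",                          # hypothetical topic
        bootstrap_servers=["broker1:9092"],
        group_id="l48-analytics",
        value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    )

    for message in consumer:
        event = message.value
        # real-time path: validate, enrich, and hand off to the batch/serving layer
        print(event.get("well_id"), event.get("pressure"))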
the st louis fed is one of 12 reserve banks serving all or parts of missouri illinois indiana kentucky tennessee mississippi and arkansas with branches in little rock louisville and memphis the st louis fed's most critical functions include: promoting stable prices and economic growth fostering a sound financial system providing payment services to financial institutions supporting the u s treasury's financial operations and advancing economic education community development and fair access to credit

overview: treasury technology services (tts) is the embedded information technology organization for the treasury division at frb st louis tts develops and supports applications ranging from off-the-shelf cash management packages to enterprise-level custom accounting systems tts is made up of two organizations: engineering and shared services the engineering team consists of several application development teams each of which is dedicated to specific treasury business lines the shared services team consists of the project management office the architecture office the data management team and the dev ops teams tts is seeking a lead data engineer to develop effective and efficient database release deliverables into all development test and production environments the lead data engineer will assure the database release along with its associated software satisfies the business requirements of our customer for systems that are critical to the infrastructure of the department of the treasury the lead data engineer also will design well coded database objects such as stored procedures oracle packages functions and views as well as database pl sql scripts used for data migration fixes transformations and test data generation; and prepares tests and maintains database release code throughout the project's software development life cycle

responsibilities:
* develop enhance and troubleshoot complex data engineering and data integration platforms developed in oracle informatica and scripting languages like shell and python
* resolve problems quickly and effectively
* effectively collaborate with and support our software developers database architects data analysts and data scientists on data initiatives and ensure optimal data delivery architecture is consistent throughout ongoing projects
* develop software in a distributed system cloud environment using big data technologies e g hadoop rds oracle cassandra nosql etc
* assess and support peer code reviews database defect support as well as provide backup production support as needed
* work closely with devops to package releases and ensure the process follows change management
* apply knowledge of tools like github ansible
* implement large-scale projects in both waterfall and agile; and estimate and scope work

qualifications:
* bachelor's degree with a major or specialized courses in information technology or commensurate experience
* 5 years related experience
* experience with informatica powercenter
* experience designing and delivering cross functional custom reporting solutions
* experience with relational sql and nosql databases oracle
* experience with aws cloud services: ec2 s3 rds redshift
* must be self-directed and comfortable supporting the data needs of multiple teams systems and products
* Must be resourceful and creative in identifying ways to mitigate issues and risks to avoid project delays
* Extremely effective written and verbal communication skills
* Travel (5%)
* US citizen required

Ranked as a top workplace, the Federal Reserve Bank of St. Louis is committed to building an inclusive workplace, where employees' diversity (in age, gender, race and ethnicity, sexual orientation, gender identity or expression, and disability, as well as cultural traditions, religion, life experiences, education, and socioeconomic backgrounds) is recognized as a strength. Embracing our diversity encourages employees to bring their valued perspectives to the table when generating ideas and solving problems, and promotes an environment where innovation and excellence thrive. Learn more about the Bank and its culture; check out our careers site. The Federal Reserve Bank of St. Louis is an equal opportunity employer.

**Organization:** *Federal Reserve Bank of St. Louis*
**Title:** *Lead Data Engineer w/ Informatica*
**Location:** *MO-St. Louis*
**Requisition ID:** *254682*
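The migration, fix, and test-data scripting this role describes follows a common extract-transform-load pattern. As a minimal, hedged sketch only (Python, with the standard-library sqlite3 module standing in for the Oracle environment the posting names, and hypothetical table and column names):

```python
import sqlite3

# Minimal sketch of a data-migration "fix" script: copy rows from a legacy
# table into a release table, normalizing one column along the way.
# Table and column names are hypothetical; a real release script would target
# Oracle via PL/SQL or a driver such as cx_Oracle.

def migrate_accounts(conn: sqlite3.Connection) -> int:
    rows = conn.execute("SELECT id, name, balance FROM legacy_accounts").fetchall()
    # Transformation step: trim/normalize names, default null balances to 0.
    cleaned = [(rid, name.strip().upper(), balance or 0.0) for rid, name, balance in rows]
    conn.executemany(
        "INSERT OR REPLACE INTO accounts (id, name, balance) VALUES (?, ?, ?)",
        cleaned,
    )
    conn.commit()
    return len(cleaned)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE legacy_accounts (id INTEGER, name TEXT, balance REAL)")
    conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, name TEXT, balance REAL)")
    conn.execute("INSERT INTO legacy_accounts VALUES (1, '  smith ', NULL)")  # test data generation
    print(migrate_accounts(conn), "rows migrated")
```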
Our client is the leading provider of personalized customer experience solutions for over 300 of the world's most well-known brands. Our client is looking for a hands-on engineer to lead the architecture and implementation of next-generation personalization solutions. In this role you will be working with an excellent team of front-end engineers, platform engineers, and our data sciences team to bring these web personalization solutions to our customers. These range from advanced recommendation systems to search augmentation and other consumer and business personalization solutions. You will be delivering powerful but simple and usable solutions, leveraging cloud-based, innovative machine learning algorithms built on big data technologies.

The company is looking to hire a data-savvy Ruby data engineer to build and maintain a robust, scalable, and sustainable data warehouse that will aid the company in its quest to shake up the social commerce industry. Surrounded by a diverse group of fun-yet-dedicated individuals, from agile project leads and talented developers to fashion writers and photographers, this person will tackle highly scalable systems with complex data models and a large amount of transactional data. So if you're eager to get your hands dirty at the scripting level (Ruby), are a SQL ninja, and have familiarity with enterprise toolkits, let's talk!

What you'll be doing: You'll come into this position at a critical junction; the foundations have been laid for the creation of a data warehouse that will have a company-wide impact, and you'll stand at the helm of its ground-breaking development. Working side by side with members of the analytics team and pair programming with other top-notch engineers in our agile, test-driven development culture, you'll create new ETL and data management tools in Ruby, using your familiarity with data warehousing best practices as well as your ability to think outside of the box. You'll participate in the analysis, design, requirements gathering, functional/technical specifications, deployment, and testing of all matters pertaining to the enterprise data warehouse. The insight provided by your work will have a direct influence on revenue and customer interaction, and what you create will be an example to others in the industry.

Responsibilities:
• Lead the company's data warehouse and ETL efforts within our agile workflow, including training and pairing with other engineers
• Develop and maintain our custom Ruby ETL tools and data warehouse platform
• Integrate with vendors, warehouse tools, and other third parties in a test-driven way
• Recommend standards and methodology for creation, capture, maintenance, and integration of metadata
• Work with our data scientists and analysts to implement custom algorithms for product recommendations, inventory modeling, and more into the data warehouse's processes
• Provide expertise and leadership on decision support technology and data warehousing throughout the enterprise
• Work closely with the analytics team to gather technical requirements
• Design, develop, and maintain the logical and physical dimensional data model of the enterprise data warehouse
• Implement KPI dashboards, dynamic reports, OLAP cubes, ad-hoc reporting, scheduling, monitoring, and automation

Requirements:
• Proficiency in object-oriented programming, specifically Ruby and Ruby on Rails
• Experience in data warehousing, including dimensional modeling concepts
• Strong skills in operational data store and data warehouse architecture
• Expert knowledge of relational databases and SQL
• Proficiency in Unix/Linux/OS X
• Strong analytical and problem-solving skills and high attention to detail
• The ability to learn new paradigms, tools, and processes quickly
• Flexibility in a fast-paced, rapidly changing, exciting work environment
• The ability to prioritize, self-manage, and seek help when necessary
• Excellent written and oral communication skills

Nice to haves:
• Understanding of the principles of data management, data governance, metadata management, data design, and integration
• Leadership experience within a business intelligence or analytics group
• E-commerce experience
• Experience working in an agile development environment
• Solid grasp of data mining concepts and techniques
• A passion for e-commerce, crowdsourcing, and fashion
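The dimensional-modeling work this posting describes (fact and dimension tables behind the KPI dashboards) typically reduces to surrogate-key lookups plus fact inserts. A minimal sketch, in Python with sqlite3 for self-containment; the schema and names (dim_product, fact_sales) are hypothetical illustrations of a star schema, not this company's actual warehouse:

```python
import sqlite3

# Minimal star-schema load sketch: resolve a dimension key, then insert a fact row.

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY AUTOINCREMENT, sku TEXT UNIQUE);
CREATE TABLE fact_sales (product_key INTEGER, sale_date TEXT, amount REAL);
""")

def product_key(sku: str) -> int:
    # Dimension lookup-or-insert: facts reference surrogate keys, not raw SKUs.
    conn.execute("INSERT OR IGNORE INTO dim_product (sku) VALUES (?)", (sku,))
    (key,) = conn.execute("SELECT product_key FROM dim_product WHERE sku = ?", (sku,)).fetchone()
    return key

for sku, day, amount in [("A-1", "2018-03-01", 19.99), ("A-1", "2018-03-02", 9.99)]:
    conn.execute("INSERT INTO fact_sales VALUES (?, ?, ?)", (product_key(sku), day, amount))

# A KPI query joins facts back to the dimension.
print(conn.execute(
    "SELECT sku, SUM(amount) FROM fact_sales JOIN dim_product USING (product_key) GROUP BY sku"
).fetchall())
```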
With more than 60 million subscribers and counting, Spotify is looking for an experienced data solutions lead to work closely with our premium business teams in Stockholm to drive cross-functional data initiatives. *Please note: this role will require travel to Stockholm, Sweden, one week per month.*

What you'll do:
• Understand the strategy, priorities, and pain points of our premium team and bring that deep understanding to the analytics community at Spotify
• Maximize the cross-functional impact of our research and analytics platforms to drive new subscriptions and retention for Spotify
• Partner with functional leads to identify and evaluate technology investment opportunities that increase the scale and efficiency of subscriber acquisition
• Conceive of new features or components for existing analytics projects or platforms that would advance the work of our premium team
• Consistently provide a business point of view into our research and platform projects
• Design new solutions (spanning research, platforms, culture) to advance the work of the function and/or of analysis within the function; consult with other solutions leads to explore the potential for common solutions
• Design and evolve a data-first strategy for the premium team: advise on how the function becomes data-first, concept new areas in which data and analysis could drive performance impact, and recommend ideas and solutions for how they can achieve that impact
• Draw on best practices learned from interaction with other companies, scholarship, conferences, etc., and devise new data-first strategies, practices, and ideas as part of the solutions team
• Communicate what we solve and build in data to the function, ensuring that outputs land successfully and achieve their intended impact; follow that impact and share findings with data squads to inform our 'tweak it' phases

Who you are:
• Minimum 5 years of experience in acquisition marketing, offer strategy, retention analytics, payments, or subscriber analytics roles at Fortune 500 companies
• Experience with marketing technology platforms strongly preferred (DMPs, CRM, mobile tracking platforms, attribution platforms)
• Proficient in SQL and familiar with big data technologies
• A minimum of 3 years of hands-on data experience in analytics
• A minimum of 3 years of experience working cross-functionally with engineers, researchers, data scientists, and business stakeholders

We are proud to foster a workplace free from discrimination. We strongly believe that diversity of experience, perspectives, and background will lead to a better environment for our employees and a better product for our users and our creators. This is something we value deeply, and we encourage everyone to come be a part of changing the way the world listens to music.

Funding Circle brings together small businesses and investors in a way that is truly revolutionary. Our mission is to foster an environment where small business can thrive. Our online platform provides a marketplace where investors receive better returns and small businesses find lower rates. The driving force behind our product is our engineering team; we are building elegant, sustainable, and scalable infrastructure on a global scale, and we want you to be a part of it! Our mission: to build a better financial world.

Prospectus: Would you describe yourself as a data fanatic? Do you have a passion for integrating and analysing data sources to help your company make better decisions faster? If you answer yes to these questions, then we're looking for you to join our team! Funding Circle's data team are looking for a senior data engineer to help build and transform FC UK's data warehouse. The right person will be:
• An enthusiast: you understand what a good data warehouse can bring to a business and what it takes to build one
• A communicator: you can communicate effectively to engineers as well as business users
• A builder: you are experienced in all stages of warehouse development, from data modelling to building out comprehensive ETL
• A pathfinder: you are interested in solving problems today while helping shape strategic direction for the months and years ahead
• A thinker: you have an inquisitive mindset and the desire and ability to turn business requirements into working software
• An owner: you'll take pride and ownership in the quality of the work you and the team produce

The UK data team are part of a global team tasked with building and maintaining the data platform that supports analytics and reporting for FC. Our stakeholders are from every area of the business, from teams tracking performance through BI specialists to data scientists. We have identified data as one of our key strategic assets, and so this role is one in which you can make a visible impact on the business.

The ideal candidate will have:
• 7+ years in a development and data engineering role, preferably in tech, consulting, or finance
• Extensive hands-on experience with SQL (we use Postgres)
• Strong programming skills (Python, Java, Ruby, Scala, Clojure; we love them all)
• Familiarity with the AWS stack, including EMR, Lambda, and EC2
• Proficiency using large data sets and relational and dimensional modeling
• Expertise in a Unix environment
• Experience with messaging and streaming platforms (Kafka, RabbitMQ, JMS, etc.)
• Exposure to big data/NoSQL systems and the issues that arise from working with large data sets
• A self-starter attitude, with an enthusiasm to work in a fast-paced, team-oriented start-up environment

Bonus points for:
• Engineering, CS, or finance degree
• Experience in virtualized environments (Mesos, Marathon, Chronos, Docker, etc.)
• Skills working with structured and unstructured data sets
• Microservice architecture development experience

Why join us? Happy employees are productive employees. That's why we offer a hearty benefits package: from learning and development and commuter stipends to a competitive salary, equity, and health benefits, we've got you covered! That being said, have you heard about what we're doing?! Our mission is what really motivates us to come to work each day: we're supporting small business, the engine of economic growth; we're helping facilitate higher yields for investors and lower interest rates for borrowers; and we can fund loans extremely quickly, all online! We have a clear competitive advantage globally in areas like domain expertise and regulatory processes. Pursuant to the San Francisco Fair Chance Ordinance, we will consider for employment qualified applicants with arrest and conviction records. Funding Circle provides equal employment opportunity to all individuals regardless of their race, age, creed, color, religion, national origin or ancestry, sex, gender, disability, veteran status, genetic information, sexual orientation, gender identity or expression, pregnancy, or any other characteristic protected by state, federal, or local law.

The data modeler focuses on activities related to the creation, modification, and population of SQL Server databases, archiving a wide variety of client legacy data to MediQuant products. The incumbent must be knowledgeable of MediQuant software products and how these work in conjunction with created databases.

Job duties and essential functions (a qualified individual must be able to perform the essential functions of the job as listed, with or without accommodation; an asterisk (*) identifies an essential function):

Translates functional product requirements into technical designs (*). Configures and tests objects. Creates weekly detailed status reports on project metrics. Creates and delivers training to the end users.

Troubleshooting and problem solving (*): identifies problems and proactively intervenes to mitigate or eliminate the potential for negative impact. Effectively checks work for accuracy, understanding where opportunities for errors exist, and takes ownership to ensure own work is error-free. Determines when an immediate fix is the best course, or when it is necessary to hand the situation to the identified source for root-cause resolution. Uses experience and expertise sufficient to intrinsically know the data and manage the standard validation process.

Organization and time management (*): handles multiple projects and prioritizes deadlines. Identifies and utilizes all resources available when priorities conflict or when external challenges are lining up against the deadline; must be able to clearly communicate the issue and be tenacious in finding resolution. Handles a fluctuating workload and is able to prioritize during times of peak demand and conflicting priorities. Knows when to seek assistance to ensure deadlines are met and a quality product is delivered.

Communicate clearly with internal customers (*): understands the importance of keeping all team members apprised of the project status. Applies technical knowledge and seeks to fully understand the client's expectations by asking questions of identified project resources. Works as the migration expert on the implementation team, along with sales and support; liaises with application developers and the project management office to create methodology for meeting client needs.
Competencies: collaboration skills, communication proficiency, organizational skills, presentation skills, problem solving/analysis, technical capacity, thoroughness, time management.

Qualifications. Required education and experience: bachelor's degree in computer engineering, computer systems, computer science, engineering, or software engineering (or an equivalent number of years of experience), plus three years of data modeling design, architecture, administration, and development experience preferred; or high school diploma or GED plus 3-5 years of applicable experience required. Strong SQL scripting skills and experience parsing and manipulating file data. Proven healthcare IT, clinical, or revenue cycle experience. Experience with full lifecycle implementations (from analysis through deployment and support). Highly skilled in process chain design, data security, and object migration to different environments. Extensive experience with design documentation.

Intermediate-level knowledge of SQL (*): creates, modifies, and populates SQL Server databases in accordance with the particular environments and parameters of the system that is being migrated. Utilizes knowledge of tables and normalization to determine where data should reside. Determines an accurate and efficient approach to writing and editing the SQL scripts used to load client data. Uses SQL to diagnose and solve data integrity issues.

Preferred education and experience: revenue cycle experience preferred; previous experience in the hospital & healthcare computer software industries preferred. Ability to discern and clearly understand clients' needs through effective communication and interaction. Ability to solve problems using logical thought processes and devising creative solutions. High attention to accuracy and detail. Ability and desire to work in a strong team culture. Knowledge of HIPAA regulations involving PHI, and of HITECH. Strong customer service and information technology acumen.

Supervisory responsibility: the data modeler role has no supervisory responsibilities, but does serve as a coach and mentor for other DMs.

Work environment: this job operates in a professional office environment. This role routinely uses standard office equipment such as computers, phones, printers, and photocopiers.

Physical demands: the physical demands described here are representative of those that must be met by an employee to successfully perform the essential functions of this job, with or without accommodation. This is largely a sedentary role; however, the employee must be able to stand, walk, and occasionally kneel, stoop, or crouch; use hands to key information, touch, feel, and type; must be able to communicate effectively, including hearing, listening, seeing, and speaking clearly; must be able to analyze, memorize, problem solve, read, and perform simple to complex math; and may frequently lift 10 lbs.

Position type and expected hours of work: this is a full-time position. Days and hours of work are Monday through Friday; hours of the workday are flexible, yet require the employee to work a minimum of 40 hours per week.

Work authorization/security clearance: in compliance with federal employment laws, MediQuant will verify the identity and employment authorization of each person hired.

AAP/EEO statement: MediQuant, Inc. is an equal opportunity employer.

Other duties: please note this job description is not designed to cover or contain a comprehensive listing of activities, duties, or responsibilities that are required of the employee for this job. Duties, responsibilities, and activities may change at any time, with or without notice. Travel: little to no travel is expected for this position.
Data Architect: SQL Server, OLTP and OLAP systems, ETL processes.

Position summary: seeking a talented and motivated data architect and visionary senior leader to integrate, enhance, and revolutionize our diverse claims data assets to support our mission of reimagining and reinventing claims decision-making for our clients. Reporting to the head of the Analytics Center of Excellence, the VP of Data Warehousing / Claims Data Officer will be responsible for the vision and strategy for all data management activities across the claims businesses, from establishing the overall direction, architecture, and technology for our future-state data architecture to developing and managing processes to improve institutional claims data through standards, integration, protection, and governance. This role has uniquely broad responsibilities and scope. It is a challenging role with a great deal of autonomy, and a growth role for the right individual; you will be working with a fun, passionate, and dedicated team, one that has the drive and mindset of a start-up with the backing of a major corporation.

Responsibilities: develop data models using handwritten SQL and ETL tools. Develop both the logical and physical data models in conjunction with business requirements. Communicate with business areas and other developers. Provide support and fix issues in the packages. Design, build, and deploy effective SSIS packages.

Requirements: advanced SQL, including performance tuning; SQL Server preferred (3+ years). Deep understanding of OLTP and OLAP systems (3+ years). Dimensional modeling experience in at least one of the following methods: Kimball, Data Vault, or Inmon (3+ years). Experience designing and building complete ETL processes (2+ years). Deep understanding of data analytics for the purpose of supporting business and data science efforts. Experience on big data platforms such as AWS is a big plus. Experience using Pentaho is a big plus. Experience using BI tools is a big plus. Nice to have: insurance industry domain knowledge (particularly auto); CPCU, IDMA, PMP.

This position is a full-time employee position with our client, located in the Jersey City, NJ area. Interested?
Please send your resume to: Jacqueline Tagorda at jtagorda@galentechsolutions.com.

Who we are: Galen Technology Solutions is a full-service technology consulting firm with offices in New York City and Florham Park, NJ. Galen has positioned itself as a trusted leader in providing technology services to Fortune 100 companies in all major markets since 1998. Our staff has been providing a diverse offering of technology staffing and solution services to leading financial, healthcare, manufacturing, management consulting, and technology firms in the New York metropolitan area. Galen's overall success can be directly attributed to our people and principles. Our creed of hard work, integrity, and professionalism has established us as a leader and primary supplier in the industry. We have an experienced staff with expertise in technology staffing, placement, and end-to-end technology solutions. We are fully dedicated to identifying your specific needs and developing the necessary strategies to effectively recruit, hire, and deliver top talent. Our established relationships and proven methodologies have provided us with a strong track record of delivering results for our clients. http://www.galentechsolutions.com. Primary recruiter: Jacqueline Tagorda.

Job title: Senior Data Engineer / Architect.

Why work here as a Senior Data Engineer / Architect? Our client is a fun, trendy, and rapidly growing tech company in the fashion industry! This company features multiple brands promoted by celebrity personalities. Their workspace features expansive spaces with multiple photo studios, an on-site cafeteria and coffee shop, and outdoor spaces.

As a Senior Data Engineer / Architect you will:
• Oversee, guide, and direct the data pipeline and data management platforms
• Drive BI solutions strategy and implementation
• Monitor, refine, and maintain system performance, and provide statistical reporting
• Work with the data science and project/product management teams in understanding business requirements
• Interface with product managers, senior architects, the engineering team, business users, etc.
• Work with the data modeler in developing detailed ETL specifications based on requirements
• Work closely with the data architect and data warehouse architect in developing and modeling, and provide detailed technical design documentation
• Participate in cross-functional meetings to review requirements and user stories, and assist in fit/gap analysis
• Develop workflows to extract, transform, clean, and move data from the business systems into an MPP data warehouse
• Utilize both in-house and third-party data sources to develop and maintain the ETL footprint
• Profile and understand source data, including structured and semi-structured web activity data
• Provide stakeholders operational support on data pipelines
• Triage, identify, and fix scaling challenges
• Diagnose issues, perform root cause analysis, and recommend course-correction
• Propose improvements in data reliability, efficiency, and quality where appropriate
• Mentor, train, and conduct knowledge transfer sessions for junior developers

What gets you the job?
• 6+ years' data engineering experience with high-performance big data platforms
• 4+ years' data warehousing experience on enterprise-level, cloud-based development efforts
• ETL development experience using Informatica or Talend; Airflow and Luigi a plus
• 5+ years' coding experience in languages such as Python, Java, Ruby, PHP
• Experience handling large amounts of raw data, including web logs, click streams, and data feeds
• 2+ years' experience with a big data cluster, i.e., the Hadoop ecosystem on HDP, Cloudera, or AWS
• 2+ years' experience with Spark, Kafka, Hadoop, and distributed datastores
• Experience developing ELT pipelines using Teradata, Vertica, Redshift, etc.
• Experience implementing streaming pipelines
• Experience with source code management using Git or Subversion, and with release processes
• Understanding of data warehouse architectures
• Strong metadata modeling experience
• Experience with scalable systems in a load-balanced environment
• Ability to conduct load tests
• Scala experience is a plus
• E-commerce or retail experience is a huge plus
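Postings like this one pair batch ETL tools (Informatica, Talend) with orchestrators such as Airflow or Luigi. As a hedged illustration only, here is a minimal Airflow DAG sketch of a daily extract-then-load dependency; the DAG and task names are hypothetical, and the Airflow 2.x-style API is assumed:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Hypothetical, minimal daily ETL DAG: extract runs before load.
# Function bodies are placeholders; a real pipeline would pull from the
# source systems and write into the MPP warehouse the posting describes.

def extract(**context):
    print("extracting raw web-activity data for", context["ds"])

def load(**context):
    print("loading transformed rows into the warehouse for", context["ds"])

with DAG(
    dag_id="daily_web_activity_etl",
    start_date=datetime(2018, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # load depends on extract
```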
Join an entrepreneurial team of talented developers in driving the future of a suite of industry-leading pharmaceutical software products. IQVIA CTOS is in immediate need of an exceptional data architect who can implement creative solutions for enhancements, integrations, and performance-boosting for existing products. You will own significant portions of the product and will have significant influence on our strategy by helping define and build the next wave of product features and system architecture. The ideal candidate is passionate about technology, data, analytics, and bringing insights to users, helping us scale our application across an expanding customer base and solve interesting problems.

**Responsibilities:**
+ Serve as a top-level technical expert on large, complex projects
+ Analyze clinical, medical, physician, and patient data sources, working with SMEs to understand user needs and architect data integration and maintenance strategies
+ Define deliverables, costs and benefits, and ROI
+ Design, develop, test, deploy, and document core application and analytic capabilities of a business performance optimization software product
+ Work with data scientists, SMEs, and engineering to build, deploy, and test machine learning algorithms
+ Collaborate with architects and other engineers to drive and build innovative software solutions, defining best practices and methodologies
+ Collaborate with cross-functional teams for product releases
+ May have duties instructing, directing, and checking the work of other system engineering personnel
+ May have quality assurance review responsibilities

**Requirements:**
+ Bachelor's in computer science, engineering, or a related field
+ 8+ years of software development experience, with a strong understanding of Oracle DBMS, SQL, and PL/SQL
+ 3+ years hands-on experience with big data technologies like Hadoop, MongoDB, Spark
+ Hands-on experience with data warehousing and master data management
+ Firm understanding of distributed systems, data federation, and virtualization
+ Experience with data acquisition, analysis, and incremental integration from various sources
+ Experience must include building enterprise-scale applications in the analytics or business intelligence / data warehousing area, and optimizing large data loads and data analysis for both near-real-time and batch processes
+ Exposure to information retrieval, statistics, or machine learning tools and technologies
+ Good knowledge of computer science fundamentals: data structures, algorithms, complexity analysis
+ Good understanding of the SDLC, and experience in end-to-end feature development and communicating across cross-functional teams
+ Excellent communication skills
+ Highly innovative, flexible, and self-directed, but also a cooperative team player

Nice to have:
+ Experience building customer-facing, highly scalable, and interactive analytic web applications
+ Experience with search frameworks like Solr and Elasticsearch
+ Experience with graph databases like Neo4j
+ Basic understanding of core Java and JDBC

**Announcing: IMS Health and Quintiles are now IQVIA™. Join us on our exciting journey!** IQVIA™ is The Human Data Science Company™, focused on using data and science to help healthcare clients find better solutions for their patients. Formed through the merger of IMS Health and Quintiles, IQVIA offers a broad range of solutions that harness advances in healthcare information, technology, analytics, and human ingenuity to drive healthcare forward.

**Did you know?** We know that meaningful results require not only the right approach but also **the right people**. Regardless of your role, we invite you to reimagine healthcare with us. You will have the opportunity to play an important part in helping our clients drive healthcare forward and ultimately improve human health outcomes. Whatever your career goals, we are here to ensure you get there! We invite you to join IQVIA™.

Basic purpose: the Enterprise Data & Analytics group at our client is looking for a big data developer to be part of a team that designs and develops big data solutions that meet business objectives. This is an exciting opportunity to work for a family-owned company that continues to experience growth, and to get in on the ground floor to help build the company's big data practice. The ideal candidate has deep technical knowledge of the Hadoop stack and possesses a desire to push the business further through innovation. This role requires a close partnership with the data science and analyst community, as well as various IT teams, to ensure requirements are met and solutions are supportable and scalable.

Major responsibilities: design and implement data ingestion techniques for real-time and batch processes for a variety of sources into Hadoop ecosystems and HDFS clusters. Visualize and report data findings creatively in a variety of visual formats that provide insights to the organization. Maintain knowledge of data, master data, and metadata related standards, processes, and technology. Define and document architecture roadmaps and standards. Drive use case analysis and solution design around activities focused on determining how to best meet customer requirements within the tools of the ecosystem. Ensure scalability, high availability, fault tolerance, and elasticity within the big data ecosystem. Architect and develop ELT and ETL solutions focused on moving data from a highly diverse data landscape into a centralized data lake; also architect solutions to acquire semi-structured and unstructured data sources such as sensors, machine logs, click streams, etc. Manage all activities centered on obtaining data and loading it into an enterprise data lake. Serve as an expert in efficient ETL, data quality, and data consolidation. Stay current with vendor product roadmaps and make recommendations for adoption. Maintain a customer-focused attitude.
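The batch half of the ingestion work described above (moving diverse sources into an HDFS-backed data lake) is commonly expressed as a small Spark job. A minimal, hedged PySpark sketch, with hypothetical paths and columns:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Minimal batch-ingestion sketch: read a raw CSV drop, stamp it with a load
# date, and append it to a partitioned Parquet area of the data lake.
# Paths and column names are hypothetical.

spark = SparkSession.builder.appName("batch_ingest_sketch").getOrCreate()

raw = (
    spark.read
    .option("header", "true")
    .csv("hdfs:///landing/clickstream/2018-03-01/")
)

curated = raw.withColumn("load_date", F.lit("2018-03-01"))

(
    curated.write
    .mode("append")
    .partitionBy("load_date")  # partitioning keeps downstream scans cheap
    .parquet("hdfs:///lake/clickstream/")
)
```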
Education and requirements. Education: bachelor's degree or equivalent in information technology, computer sciences, or computer engineering. Experience: 8 years IT experience; 3+ years of experience building large-scale data solutions involving data architecture, data integration, business intelligence, and data analytics; 1+ year of experience working on large-scale big data projects; deep technical knowledge of most components contained within the Hadoop ecosystem (MapReduce, HDFS, YARN, Hive, HBase, Sqoop, etc.), preferably with the Hortonworks distribution; experience building streaming analytics solutions using NiFi, Storm, or other similar technologies; understanding of statistical and predictive modeling concepts a plus; strong Java/J2EE experience; experience with visualization tools; experience with RDBMS platforms such as SQL Server and in-memory columnar storage such as HANA.

Work for the company's data management solutions. The BW data team manages the data requirements for corporate needs (finance, HR, recruiting, legal, and other corporate needs) and distributes data to other parts of the organization, including trading and asset management (called the investment engine). BW also has a data science team that is working on next-gen AI software to automate their internal decision-making process. I have already positioned the ShareInsights platform, and they're open to discussing it further.

Role:
• Manage and execute corporate data solution delivery projects from design through implementation
• Visualize and build technical designs and proofs of concept in line with business requirements
• Synthesize goals into outcomes at the project level and provide updates to the solution delivery lead
• Be a technology thought partner for the solution delivery lead on the execution and evolution of the solution delivery pillar
• Keep abreast of industry trends and leverage expertise to influence the adoption of new technologies in the data management area
• Accurately perceive, understand, and solve problems
• Be open to providing and receiving honest feedback on a daily basis, consistent with the culture

Key skills: data architecture, ETL, data integration, data governance, data quality, and business intelligence.

Desired skills:
• 10+ years of strong implementation experience managing and delivering complex data initiatives, and deep technical knowledge in the following areas: data architecture, ETL, data integration, data governance, data quality, and business intelligence
• Experience with the methodology, conceptualization, design, and hands-on implementation of new and existing systems: MDM, metadata, data quality, data warehouse, and production architectures
• In-depth experience and knowledge in ETL/ELT performance tuning and designing optimized integration systems
• In-depth experience running data initiatives using agile methodology
• Hands-on experience with relational databases, integration platforms (such as Informatica), BI tools (Tableau), CI/CD tools, code repositories, and workload automation tools
• Sustained track record of delivering impact in a high-intensity, results-oriented environment
• Collaborative team player, able to give and receive open and direct feedback
• Bachelor's degree in a technology or business discipline

Experience level: 10+ years, as detailed in the desired skills above.

Thank you. Geeta | Business Development Manager, Infoway Group LLC. Direct: 773-888-4116.

Position: Azure Big Data Architect. Location: Buffalo, NY.
Required skills / key responsibilities:
* Experience with Azure cloud
* Extensive experience with data architecture
* Experience with Azure Data Lake, Azure SQL Data Warehouse, Data Catalog
* Hands-on experience with Azure HDInsight, Spark, Event Hubs
* Azure Virtual Machines, Blob Storage, Azure SQL Database, StorSimple, Azure DNS, Virtual Network, DocumentDB, Redis Cache, Azure App Service
* Experience with the big data technology stack: Hadoop, Spark, Hive, MapR
* Hands-on .NET technology stack
* Strength and background in infrastructure or HPC (high-performance computing) is preferable
* Experience with one of the DB servers is an added advantage
* Work closely with clients, both in the business domain and with the technical team
* Work with the team to architect, design, and develop quality business intelligence deliverables
* Communicate the status of development, quality, operations, and system performance to all stakeholders

Requirements:
* Bachelor's degree in computer science, computer information systems, data science, or a similar discipline
* 3+ years of proven experience building and operating business intelligence solutions for consumer or retail analysis
* Knowledgeable in querying data stored using a broad range of data modeling techniques, including third normal form, dimensional, and data vault modeling, on Microsoft SQL Server-based technologies
* Experience designing and building cloud-native applications, as well as Azure cloud migration experience

Thanks & regards, Sarath | Terminal Contacts. Mail id: sarath@terminalcontacts.com. Direct: (813) 600-5819. www.terminalcontacts.com

*** Must be able to work on a W-2 salary basis. *** Must be able to work on-site. ***

**Nissan is expanding and currently seeks a Big Data Engineer (Hadoop) for direct hire!** In addition to a competitive salary and bonus, this position offers full employee benefits! Leverage your **big data, data warehouse, and ETL skills** to take your career at Nissan to the next level for **2018 and beyond!**

**Nissan in Franklin, Tennessee seeks the following. Title: Big Data Engineer.**

**Requirements:**
+ Recommend, develop, and support big data and analytical solutions; leverage Alliance-standard technologies and plan/contribute to usage of any emerging technologies
+ Translate complex functional and technical requirements into detailed architecture, design, and high-performing software solutions
+ Work on multiple projects concurrently, performing user story analysis, detailed design, and development of software applications in the big data environment and integrated systems
+ Code, test, and implement data and analytics solutions in alignment with the project schedule(s); create data flow diagrams and other living documents to support the data solutions, while also working with other developers to ensure consistency
+ Expand Nissan's data product catalog in the big data environment, and expand the data platform capabilities to solve new data problems and challenges
+ Perform logical and physical database design for big data solutions, construct appropriate data flows, and follow Nissan standards
+ Ensure effective automated processes, high data availability, and operationalization of all products
+ Create and maintain the relevant enterprise data catalog components and metadata, including the initial data intake process for BDE tenants
+ Recommend and advise on all big data components, roadmaps, and emerging opportunities
+ Contribute to standards development and data governance
**Minimum:**
+ Strong knowledge of data management concepts: data warehousing, ETL, data integration, etc.
+ Experience with agile, scrum, or other rapid application development methods
+ Demonstrable experience with object-oriented design, coding, and testing patterns, as well as experience in engineering (commercial or open source) software platforms and large-scale data infrastructures
+ Strong experience developing and implementing software and solutions in an enterprise Linux or Unix environment
+ Strong understanding of various enterprise security solutions, such as LDAP and/or Kerberos
+ Network configuration: devices, protocols, speeds, and optimizations
+ The Java ecosystem and enterprise offerings
+ Ability to understand big data use cases and recommend standard design patterns commonly used in Hadoop-based deployments
+ Bachelor's degree in computer science, computer engineering, another technical discipline, or equivalent work experience
+ 5-10 years of professional IS/IT experience overall
+ 3-5 years of large-scale software development and integration via data engineering, data science, or software engineering, with a concentration on big data
+ Direct hands-on design, development, deployment, and support of software solutions, with a recent emphasis on Hadoop solutions
+ Hortonworks HDP Certified Developer credential
+ Experience designing data queries against data in the HDFS environment (Hive & HBase)
+ Seasoned developer using different programming languages (e.g., Java) and scripting tools such as Bash shell scripts, Python, PySpark, and/or Perl
+ Experience with R
+ Significant previous work writing to network-based APIs, preferably REST, JSON, or XML/SOAP
+ Solid background in database design, modeling, and data integration on a variety of relational databases (DB2, Oracle, SQL Server, Postgres, etc.) and NoSQL databases
+ Advanced knowledge of, and experience with, all aspects of the Hadoop ecosystem (Pig, Hive, Oozie, Kafka, Hue, Spark, Zeppelin, Atlas, Solr, LLAP, etc.); messaging technologies (MQ, ActiveMQ, etc.); NiFi preferable

**Educational requirements:**
+ Bachelor's degree in computer science, decision information systems, MIS, or equivalent experience
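Several of the requirements above (Kafka, Spark, PySpark, HDFS) typically come together in a streaming ingestion job. As a hedged, minimal sketch only, using the Spark Structured Streaming API with hypothetical topic, broker, and path names:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Minimal Structured Streaming sketch: consume a Kafka topic and land the raw
# events in HDFS as Parquet. Topic, brokers, and paths are hypothetical, and
# running this requires the spark-sql-kafka connector on the classpath.

spark = SparkSession.builder.appName("kafka_stream_sketch").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")
    .option("subscribe", "vehicle-telemetry")
    .load()
    .select(F.col("key").cast("string"), F.col("value").cast("string"), "timestamp")
)

query = (
    events.writeStream
    .format("parquet")
    .option("path", "hdfs:///lake/telemetry/")
    .option("checkpointLocation", "hdfs:///checkpoints/telemetry/")  # enables restart/recovery
    .start()
)
query.awaitTermination()
```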
We **reward** inspiration and motivation. The benefits of working for **Nissan** include, but are not limited to:
+ Onsite fitness center
+ Onsite Starbucks
+ Onsite dry cleaner and shoe repair
+ Car allowance on Nissan/Infiniti vehicles
+ Vehicle lease/purchase discount program for employees and their families
+ Onsite medical care representative
+ Full benefits: medical, dental, prescription, optical, 401k (up to 3% match), company annual retirement program, and much more
+ Opportunity for growth within the organization
+ **Salary: open (based upon experience and qualifications)**
+ **At this time we are not in a position to offer any type of sponsorship or visa transfer for this position.**

**Important information:** this position is recruited for by a remote Kelly office, not your local Kelly branch. To be considered for this position, you must apply now to submit your resume. If you have questions about the position, you may contact the recruiter recruiting for this position by email: petf054@kellyservices.com.

**Why Kelly®?** With Kelly, you'll have direct connections to leading IT organizations in the best companies around the globe, offering you the chance to work on some of today's most intriguing, innovative, and high-visibility projects. In a field where change is the only constant, our connections and opportunities will help you take your career exactly where you want to go. We work with 95 of the Fortune 100™ companies, and more than 9,000 IT hiring managers turn to us each year to access the best talent: people like you. Last year we found 10,000 opportunities for IT professionals. Let us help advance your career today.

**About Kelly Services®:** As a workforce advocate for over 70 years, we are proud to have a role in managing employment opportunities for more than one million workers around the globe. We employ 550,000 of these individuals directly, with the remaining workers engaged through our talent supply chain network of supplier partners. Revenue in 2015 was $5.5 billion. Visit kellyservices.com and connect with us on Facebook, LinkedIn, and Twitter. Kelly Services is an equal opportunity employer, including but not limited to minorities, females, individuals with disabilities, protected veterans, sexual orientation, and gender identity, and is committed to employing a diverse workforce. Equal employment opportunity is the law: https://www.dol.gov/ofccp/regs/compliance/posters/ofccpost.htm

SLAIT Consulting is currently seeking a Data Solutions Architect for our client in Herndon, VA.

Summary: the Data Solutions Architect is a senior member of the data management team, with a focus on the design, development, and implementation of advanced analytics to identify and exploit insights with positive business impact.

Responsibilities:
* Improve the performance, data quality, and data security of the data warehouse and data mart repositories
* Envision the future of the data warehouse and data marts, and think of innovative ways that data science can be applied to benefit the business
* Contribute to the delivery of major business solutions
* Analyze large data sets to develop custom models and algorithms to drive business solutions
* Build large data sets from multiple sources in order to develop and train algorithms for predicting future data characteristics, and provide the appropriate recommendations in real time
* Conduct advanced statistical analysis to determine trends and significant data relationships
* Participate in projects from a technology perspective, ensuring that software development life cycle activities are consistent with the direction set by accepted best practices and internal standards
* Design and develop data warehouse applications, and assist project teams and production support staff in building new data warehouse capabilities and resolving data warehouse related issues
* Design and implement efficient and effective methods and processes to move data between applications and the data repositories, using ODI and other tools
* Work on highly technical initiatives, including but not limited to: research, tool evaluation, documenting patterns and standards, technology evaluation and recommendation, developing shared libraries and resources, and mentoring developers through hands-on assistance, training, and some technical development
* Work with application architects, data architects, DBAs, and business partners to provide recommendations and best practices for the data warehouse data models and data services required to support strategic initiatives

Qualifications:
* Bachelor's degree in computer engineering, computer science, or a related field; a combination of education and experience, including military service, will also be considered
* 10 years of hands-on technical work with database technologies and applications, and experience with data modeling and database design
* 4 years of experience in data mining and machine learning
* Demonstrated experience with: data warehouse, database, and enterprise system architecture; business intelligence and data warehouse concepts; ETL concepts and tools; developing and applying advanced quantitative techniques toward solving business and information technology problems; using Oracle, Microsoft SQL Server, and statistical software packages (such as SAS, Stata, and/or R) to create analytic datasets from very large, complex relational databases; and developing custom models and data mining
* Understanding of data storage technologies: relational databases, SQL, XML, JSON, etc.
* Knowledge of statistical research techniques, including clustering and segmentation
* Strong programming ability, including experience programming in one or more programming languages
* Demonstrated strong negotiation, problem-solving, and analytical skills
* Excellent communication and interpersonal skills, and the ability to communicate effectively with third parties and internal staff at all levels of the organization
* Live within a commutable distance of Herndon, VA

Desired skills:
* Experience working with statistics, artificial intelligence, predictive analytics, machine learning, and statistical modeling
* Oracle Data Integrator experience
* Knowledge of AWS, Azure, Google Cloud
* Experience working in an agile environment
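The clustering and segmentation techniques this posting asks about are commonly prototyped with scikit-learn. A minimal, hedged sketch on synthetic data with hypothetical feature names (annual spend, visits per month), not a production model:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Minimal customer-segmentation sketch: standardize two hypothetical features
# and cluster customers into 3 segments.

rng = np.random.default_rng(0)
# Synthetic spend centered on three different levels, plus a visits feature.
spend = rng.normal(loc=[200, 800, 1500], scale=50, size=(100, 3)).reshape(-1, 1)
visits = rng.normal(loc=4, scale=1.5, size=(300, 1))
X = np.hstack([spend, visits])

X_scaled = StandardScaler().fit_transform(X)  # k-means is scale-sensitive
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X_scaled)

for k in range(3):
    seg = X[labels == k]
    print(f"segment {k}: n={len(seg)}, mean spend={seg[:, 0].mean():.0f}")
```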
Physical requirements:
* Use of a computer terminal and/or laptop computer for 8 or more hours a day
* Frequently required to sit for 7 or more hours per day, in close proximity to others in an open office environment
* Use of a copy machine, fax machine, and telephone
* Occasionally required to use hands and fingers to operate, handle, and reach
* Vision abilities include close vision and the ability to adjust focus
* Travel via car, train, and airplane when needed

Why SLAIT? We have been growing since 1990, with offices in Virginia and Raleigh, NC. For over twenty-eight years we have delivered customized, creative IT solutions for customers in the commercial and state and local government sectors: staff augmentation, managed services, security solutions, and IT consulting. Thank you for your consideration; please submit your resume today! Visit us at www.slaitconsulting.com. **Must be able to work for any employer in the United States; no visa sponsorship.**

**These BI data engineers can be either senior or principal data engineer level.** The BI team size is 20 people, and we focus mainly on data engineering: data science (machine learning implementation for business projects), data replication (for BI and other app teams), data warehousing (for reporting as well as ad hoc data analysis), and the BI and data science platform. Our tech stack/platform is Netezza, Informatica v9.5.1, Erwin v9.5, Python v3, AWS Redshift, S3, and Kinesis. The must-have skill sets for these positions include SQL, Informatica (PowerExchange CDC and PowerCenter), and Netezza; knowledge of the other skills listed in the job description is a great nice-to-have.

**General description (senior level):** the senior data engineer is responsible for defining, designing, and implementing data pipelines, data engineering solutions, and analytics components to support strategic initiatives and ongoing business processes for our client. This role will develop solutions that leverage, integrate with, and expand upon the company's business intelligence (BI) platform technologies and patterns. This position requires a deep understanding of data consumers and how they use data to drive business performance and achieve their goals. The senior data engineer will also act as a mentor to junior developers to help maintain a sustainable analytics ecosystem.

**Essential duties and responsibilities (senior level):**
+ 50%: Deliver and drive efforts in developing BI data solutions focused on data engineering and analytics, using Python, machine learning, RDBMS, R, SQL, data modeling, Java, Unix scripting, AWS, MicroStrategy, Power BI, and MPP databases
+ 10%: Troubleshoot technical issues in existing processes and current development work, solicit assistance from other roles and groups, and drive resolution to ensure the integrity of platform quality and project timelines
+ 10%: Understand and improve shared standard patterns, templates, and artifacts for the BI and data warehouse platform architecture, data science development approaches, data models, and new technology adoption and rollout
+ 10%: Collaborate with upstream and downstream IT and business teams to identify and document use cases, performance and capability requirements, and criteria for successful solution delivery
+ 10%: Mentor other team members on technical skills, best practices, problem-solving approaches, and standard patterns used at the client
+ 5%: Proactively generalize and share technical approaches and best practices among other developers, and simplify and communicate completed work to broader audiences across the company
+ 5%: Help support data consumers to ensure they have reliable access to trusted data; this includes periodic responsibility for 24x7 on-call production support

**Supervisory responsibilities (senior level):**
+ Lead the overall delivery plan and execution across supporting team members for assigned project work
+ Lead daily stand-up meetings within the immediate work group, and proactively communicate status and progress of project work
+ Mentor and coach teammates in business and technical knowledge and problem resolution

**Decision-making responsibilities (senior level):**
+ Recommend, explain, and defend solution design options, in part or in whole
+ Advocate and recommend adoption of specific technology and business solutions
+ Identify, evaluate, and select appropriate resolution paths for process or system risks, bugs, errors, etc., with appropriate consideration of all downstream implications and effects for data consumers in both IT and business groups
**Education / experience (senior level):**
+ Minimum required education: master's degree in computer science, MIS, engineering, business, or similar required
+ Minimum required experience: minimum 2 years' experience in delivering business solutions that rely on business intelligence, data warehousing, data science, or similar domains
+ Desirable education/experience: 4 years hands-on experience delivering solutions in at least two areas among data modeling, data integration, data analysis, and SQL; and 2-5 years hands-on experience in at least two areas among data science, big data, R, Python, and machine learning

**Technical competencies (senior level):**
+ **Data engineering skills:**
+ Analyze, summarize, and characterize large or small data sets with varying degrees of fidelity or quality, and identify and explain any insights or patterns within them
+ Understanding of core principles of data warehousing, data science, and machine learning
+ Data analytics and visualization using R and Python (NumPy, pandas, SciPy, scikit-learn, TensorFlow, and Keras)
+ Ability to work with MPP databases like **Netezza**, **Redshift**, or **Teradata**, and data pipelines using Informatica (PowerCenter and PowerExchange) and Python glue code
+ Able to build logical data models and physical data models and apply normalizing techniques with 3NF (using tools like **Erwin**), with proven experience in handling complex business scenarios
+ Experience handling 1,000+ line complex SQL code bases to deliver complex data-based solutions in both development and data analysis work
+ Disciplined approach to testing software and data, identifying data anomalies, and correcting both data errors and their root causes
+ Familiarity with and working knowledge of DMBOK (Data Management Body of Knowledge) core concepts (general data management, governance, architecture, development, security, quality, metadata management, MDM, etc.)
+ **Other technical computer skills (senior level):**
+ Ability to synthesize technology and business knowledge to deliver efficient solutions
+ Create and share standards, best practices, documentation, and reference examples for data warehouse integration, ETL systems, and end-user reporting
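The anomaly-identification duty above is often scripted as lightweight pandas checks before data is trusted downstream. A minimal, hedged sketch on a made-up frame; a real check would run against warehouse extracts:

```python
import pandas as pd

# Minimal data-quality sketch: flag nulls, duplicate keys, and out-of-range
# values in a (hypothetical) extract before it is loaded downstream.

df = pd.DataFrame(
    {
        "order_id": [1, 2, 2, 4],
        "amount": [19.99, None, 5.00, -3.50],
    }
)

problems = {
    "null_amount": int(df["amount"].isna().sum()),
    "duplicate_order_id": int(df["order_id"].duplicated().sum()),
    "negative_amount": int((df["amount"] < 0).sum()),
}

for check, count in problems.items():
    status = "FAIL" if count else "ok"
    print(f"{check}: {count} ({status})")
```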
About Fresh Gravity: founded in 2015 and rapidly expanding, Fresh Gravity (www.freshgravity.com) is an exciting business and technology consulting company that is at the cutting edge of digital transformation. We drive digital success for our clients by enabling them to adopt transformative technologies. We provide a range of services, from data management and data science & analytics to API management, SOA, and artificial intelligence. In a short time we have crafted an exceptional team who have delivered impactful projects for some of the largest corporations in the world. We are on a mission to solve the most complex business problems for our clients using the most exciting new technologies, and we are looking for top talent to join us in our quest. Fresh Gravity's team members are authorities in their field, but know how to have fun too. We're building an inspiring, open organization you'll take pride in. We challenge ourselves to grow every day. We create value for our clients and partners every day. We promise rich opportunities for you to succeed, to shine, to exceed even your own expectations. We are thoughtful. We are engaged. We are relentless. We are Fresh Gravity. Fresh Gravity is an equal opportunity employer.

Job summary: Fresh Gravity is seeking a strong Cloudera Hadoop administrator to join our fast-growing team for a well-known client.

Key responsibilities: analyze, design, create, and implement big data infrastructures, including access methods, device allocations, validation checks, organization, and security. Design data models, logical and physical infrastructure designs, etc. Support development and production deployments. Install, upgrade, and test complex big data deployments. Assist in system planning, scheduling, and implementation. Develop and implement recovery plans and procedures. Integrate data and BI tools with Hadoop.

Requirements: bachelor's degree in computer science or a related discipline, with at least five years of overall experience and a minimum of three years of experience as a Hadoop engineer, or the equivalent in education and work experience. Strong background and experience with administration of Apache Hadoop (Cloudera distribution a plus). Experience integrating BI and data tools with Hadoop. HBase DBA experience. Hands-on experience with database replication and scaling. Well versed in the design, installation, and maintenance of highly available systems (including monitoring, security, backup, and performance tuning). Highly proficient in Linux (RHEL). Proficient in query and scripting languages like Pig, Hive, and JAQL. Strong background and demonstrated experience in configuration automation (Chef, Puppet). Must possess good analytics and problem-solving skills. Must be willing to work flexible hours.

Nature of engagement: contract preferred, but willing to consider contract-to-hire. Duration: 4-6 months, with a high probability of extension. Local candidates (or candidates willing to relocate) only!
start time frame: march 19, 2018.

data engineering lead. stamford ct. regular full-time. apply now. job description: data engineering lead. gartner is looking for an application lead focusing on data engineering. this person will be a part of the team supporting gartner's client-facing experience. this includes supporting both the gartner.com website as well as other engagement applications like email. the ideal candidate will be someone who has experience building and maintaining large distributed systems. they will have experience capturing, storing, and processing both structured and unstructured data using different tools and technologies. in this position they will be working closely with data analysts and data scientists to gain insight on our client experience, as well as build machine learning applications that improve those experiences. job requirements: qualifications and technical skills
+ university degree in bachelor of engineering or a master's degree in cs with 6-8 years of experience in software development, of which 2-3 years must have been in a lead capacity
+ experience with different database technologies (relational, nosql, graph, document, key-value, time series, etc.); this should include building and managing scalable data models
+ experience building data models, infrastructure, and etl pipelines for reporting, analytics, and data science (a minimal pipeline sketch follows this posting)
+ experience with big data tools like hadoop, emr, spark
+ experience in developing and consuming web services
+ experience with internal search engines (solr, elasticsearch, etc.)
+ experience with cloud based platforms (aws)
+ strong knowledge of integration technologies (soa, rest, xml, http, etc.)
+ strong desire to improve upon their skills in software development frameworks and technologies
+ demonstrable experience innovating with technology
+ experience working in different programming languages (sql, plsql, scala, java, python, javascript, etc.)
+ experience with statistical analysis packages (r, python pandas/numpy) a plus
+ experience with version control tools (git, subversion)
leadership skills:
+ should have significant experience working directly with business users in problem solving
+ ability to lead and execute a mix of small and medium sized projects simultaneously
+ excellent communication and prioritization skills
+ ability to work collaboratively across the it organization (infrastructure, test, back office groups)
+ ability to pick up gartner domain knowledge quickly
+ ability to work independently or within a team proactively in a fast-paced agile-scrum environment
+ owns success – takes responsibility for successful delivery of the solutions
operational skills:
+ should be able to interact well with both internal associates and external clients in resolving operational issues
+ must be able to provide accurate estimates of technology work and deliver high quality work on schedule
+ identify systemic operational issues and resolve them
job id 00019329
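a minimal sketch of the kind of etl pipeline the posting above describes, assuming pandas plus a local sqlite target; the file, column, and table names are illustrative:

```python
# etl_pipeline.py - extract from csv, transform with pandas, load into sqlite
# (a sketch under illustrative names; a production pipeline would add
# scheduling, retries, and schema management)
import sqlite3
import pandas as pd

def extract(path: str) -> pd.DataFrame:
    return pd.read_csv(path)

def transform(df: pd.DataFrame) -> pd.DataFrame:
    # normalize column names and drop exact duplicate rows
    df = df.rename(columns=str.lower).drop_duplicates()
    # derive a reporting column; 'amount' is an assumed source field
    df["amount_usd"] = df["amount"].astype(float).round(2)
    return df

def load(df: pd.DataFrame, db: str, table: str) -> None:
    with sqlite3.connect(db) as conn:
        df.to_sql(table, conn, if_exists="replace", index=False)

if __name__ == "__main__":
    load(transform(extract("engagement_events.csv")), "analytics.db", "events")
```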
seeking a high-level data architect for a long term or contract-to-perm position with our client in westchester ny. local candidates only please - apply through dice, no calls please. work with the product and leadership teams to define our enterprise data model and architecture. provide architecture guidance, best practices, and detailed design to the development team for data integration projects across platforms. understand how data relates to the current products and operations, and the effects that any future changes will have on data throughout the usta. work with the solutions architect and broader data engineering development team to incorporate feedback into data models. work with the internal team to build process in support of data information lifecycle management, governance, lineage, and quality. qualifications: a ba/bs degree in data science or equivalent experience. 5+ years of experience in conceptual, logical, and physical data modeling. experience with data modeling design patterns, 3nf, and dimensional modeling, building highly scalable and secured solutions. strong understanding of cloud architecture, specifically sap and amazon (i.e. redshift), as it relates to data processing. experience leading and architecting enterprise-wide initiatives, specifically system integration, data lakes, data warehouses, etc. able to confidently express the benefits and constraints of technology solutions to technology partners, stakeholders, team members, and senior levels of management. understanding of pii standards, processes, and security protocols. familiar with data anonymization concepts and technologies preferred.

data architect. ***please note application rules policy guidelines at the bottom of this posting before applying, to ensure your application can be processed and considered. important----you must include a full and formal resume----using your linkedin profile as an option rather than a resume will not allow us to provide you full consideration. please do not email us directly, and avoid this site, as there are required application questions that you must complete for consideration. duties: you will be responsible for: (1) performing as an advanced software engineering subject matter expert (sme) and advanced technical knowledge resource and critical asset, focused on designing, engineering, implementing, deploying, and supporting global, state-of-the-art, enterprise-grade, industry-leading big data applications and solutions; (2) mentoring less experienced development engineering staff and providing coaching and advanced knowledge transfer; (3) identifying and monitoring metrics around the code development and deployment process; (4) leading global agile development initiatives; (5) leading significant and advanced global big data and cloud based initiatives; (6) leading the successful adoption of continuous integration (ci) and continuous pipeline (cp) automated software delivery processes; (7) identifying opportunities for process and performance improvement, with an eye on consistent and proactive "raising of the bar" on development standards, policies, procedures, methodology, and quality; (8) continuously collaborating in the introduction of development and engineering best practices, as well as evangelizing on emerging technologies and industry direction to impact the future of technology and strategy selection, implementation, adaptation, and life-cycle management; (9) global vendor management. required: must have 7+ years strong current software engineering, with 4+ years of strong current data sql development in a unix or linux environment, with strong experience with data science toolsets (r, python, etc.). must have 3+ years of strong current streaming and batch big data technologies (hadoop, emr, kafka, map reduce, spark, flink, etc.) and cloud-based technologies (aws, azure, google, etc.). must have 3+ years of strong current scripting (perl, shell, python, etc.) experience and 3+ years of strong current deployment and configuration tools (chef, docker, etc.). must have strong and current or recent java development skills and experience. must have completed a verifiable bs undergraduate degree in computer science or computer engineering or a similar degree.
pluses: informatica, hive, pig, ec2, s3, ebs, capital markets, low latency, algorithmic trading, high frequency trading (hft), fix, c++, c, data visualization, tableau, business objects, qlik, nosql, mongodb, cassandra, hbase, consulting background. application process and criteria: ***only applicants currently residing within the us can be considered; applicants from outside of the us cannot be considered for this role. ***this is a full-time w2 corporate it employment opportunity with salary, bonus, and benefits. this is not a consulting nor contracting role, nor a corp-to-corp (c2c) role, nor can resumes from consulting firms be considered. ****resumes with complete full contact information, including full name, full residential address, phone number, and email address, can only be considered in order to fulfill federal eoe and audit compliance policy and process; resumes without this required information will not be considered. ****linkedin "resumes" cannot be considered, as they lack the necessary contact information and do not provide the necessary substance for evaluation and processing; applicants who use the linkedin version of a resume will not be able to be considered for employment for the role they apply for. ****the online questionnaire associated with this role must be completed fully; applicants who do not complete this questionnaire cannot be considered.

software guidance & assistance inc (sga) is searching for a big data developer for a contract assignment with one of our premier financial services clients in san francisco ca. as a developer within the big data team, you will contribute to high quality technology solutions that address business needs by developing data applications for the customer business lines. you will contribute to the development and ongoing maintenance of a number of strategic data initiatives and data and analytic applications. the ability to communicate effectively is required, as you will work closely with other groups, including development and testing efforts of your assigned application components, to ensure the successful delivery of the project. responsibilities: hands-on development role focused on creating big data and analytics solutions; coding of mission critical components; analyze business and functional requirements and contribute to the overall solution; participate in design reviews and provide input to the design recommendations; participate in project planning sessions with project managers, business analysts, and team members; implement a metadata management system on the big data platform, with hands-on expertise with graph databases using hbase for storage. required skills: experience with middle-tier/backend systems development in java on linux; minimum 3-5 years working experience with hadoop in an enterprise setting; experience with java enterprise development, python, and scala; hands-on expertise with sql & nosql data platforms; hands-on expertise with big data technologies (hbase, hive, sqoop); experience with pub/sub messaging (jms, kafka, etc.) and stream processing (storm, spark streaming, etc.) - a minimal pub/sub sketch follows this posting's skills sections; understanding and application of security best practices as they relate to big data technologies; experience with horizontally scalable and highly available system design and implementation, with a focus on performance and resiliency; experience profiling, debugging, and performance tuning complex distributed systems; experience with unix shell scripts and commands; experience with data modeling; ability to clearly document solution designs; agile/scrum methodology experience; experience with etl/elt tools; experience with bi solutions (tableau, microstrategy, d3, etc.). works on complex issues where analysis of situations and data requires an in-depth evaluation of variable factors; exercises judgment in selecting methods, techniques, and evaluation criteria for obtaining results; acts independently to determine methods and procedures on new assignments and may provide work direction to others; works under minimal supervision. minimum education/training/certification: bachelor's degree in an information technology area of study. preferred skills: master's degree preferred, specializing in computer science, information management, data science, or an equivalent combination of education and experience.
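as referenced in the skills sections above, a minimal pub/sub sketch; it assumes the kafka-python package, a broker at localhost:9092, and an illustrative topic name:

```python
# pubsub_sketch.py - minimal kafka produce/consume round trip
# (assumes the kafka-python package and a broker at localhost:9092;
# the topic name is illustrative)
import json
from kafka import KafkaConsumer, KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("customer-events", {"account": "a-123", "event": "login"})
producer.flush()  # block until the message is delivered

consumer = KafkaConsumer(
    "customer-events",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    consumer_timeout_ms=5000,  # stop iterating once the topic is drained
)
for message in consumer:
    print(message.value)
```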
sga is a certified women's business enterprise (wbe) celebrating over thirty years of service to our national client base for both permanent placement and consulting opportunities. for consulting positions we offer a variety of benefit options, including but not limited to health & dental insurance, paid vacation, and timely payment via direct deposit. sga accepts transfers of h1 sponsorship for most contracting roles; we are unable to sponsor for right-to-hire, fulltime, or government roles. all parties authorized to work in the us are encouraged to apply for all roles; only those authorized to work for government entities will be considered for government roles. please inquire about our referral program if you would like to submit a candidate for any of our open or future job opportunities. sga is an eeo employer; we encourage veterans to apply. to view all of our available job postings and/or to learn more about sga, please visit us online at www.sgainc.com

job description: our minimum requirements are: master's degree in computer science or equivalent. 4+ years of experience in building large scale, high performance, high availability systems, and strong computer science fundamentals (algorithms, data structures). experience with big data technologies (spark, hdfs, hbase, cloudera, mapr, hadoop and other frameworks in the hadoop ecosystem, kafka, oozie workflow, elasticsearch, etc.) - a small spark sketch follows this posting. working experience with microservices, container, and streaming technologies. fluency in the java programming language, and familiarity with one or more of the following programming languages: scala, python. extensive experience solving analytical problems using quantitative approaches, operations research, and optimization algorithms. comfort manipulating and analyzing complex, high-volume, high dimensionality data from varying sources. passion for answering hard questions with data. excellent written and oral communication skills; able to communicate with all levels. additional information: all your information will be kept confidential according to eeo guidelines.
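a small spark sketch of the high-volume aggregation work described above, assuming pyspark and illustrative hdfs paths and column names:

```python
# spark_agg_sketch.py - high-volume aggregation with pyspark dataframes
# (a sketch; the parquet paths and column names are illustrative)
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("event-rollup").getOrCreate()

# read a (potentially very large) columnar dataset, e.g. from hdfs
events = spark.read.parquet("hdfs:///data/events")  # hypothetical path

# reduce dimensionality by aggregating: events per device per day
daily = (
    events
    .withColumn("day", F.to_date("event_ts"))
    .groupBy("device_id", "day")
    .agg(F.count("*").alias("n_events"),
         F.approx_count_distinct("session_id").alias("n_sessions"))
)
daily.write.mode("overwrite").parquet("hdfs:///marts/device_daily")
spark.stop()
```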
company description: red alpha is a fast growing high-tech government contractor providing exceptional consulting and engineering services, based in hanover maryland. red alpha is committed to our clients, employees, and our mission. our team consists of dedicated self-starters with the unique ability to succeed. our company is employee-focused, with superior benefits, involved leadership, and a proven history of success. red alpha seeks self-motivated and passionate professionals, because results follow hard work and passion. we believe that our staff is our greatest asset, because they are both our implementers and facilitators, and as such we heavily invest in them so that they reach their career goals while satisfying their penchant to continuously grow. join us as we apply our skills to tackle some of the toughest, most interesting, and rewarding challenges in the intelligence sector. red alpha wants you to become a leader in the industry and will work with you to enable our mutual success. we are rapidly growing and continuously searching for the best and brightest talent to add to our prestigious group of engineers. we have many immediate openings based in the hanover maryland area and in northern virginia. intrigued? please apply today! we would love to discuss your career goals to see how we can partner together to achieve them! red alpha specialties: cloud computing & administration, enterprise java web application development, hpc system administration, data science, and much more. job description:
• 8 to 10 years of experience in sap basis, with 5+ years of experience in sap os/db migrations
• strong knowledge of sap hana migrations
• minimum 5 sap upgrade and migration projects' experience in sap business suite (sap ecc, crm, scm, srm, etc.)
• should have done at least 2 sap ecc on hana migrations
• must have worked on at least two sap hana migration projects using the sum dmo tool [combined upgrade and migration]
• experience in migrating large databases [db size 15tb]
• experience in sap hana revision upgrades and performance tuning
• sap solution manager skills (technical) - mopz, managed system configuration, sap hana system monitoring
• sap system sizing, hana landscape architecture design and setup
• installation & configuration of sap systems based on sap hana
• should have experience in setting up the sap fiori landscape and fiori configuration for sap ecc on hana
• high availability (ha) and disaster recovery (dr) setup in sap ecc on hana
• excellent communication and team player skills, along with customer facing experience
• managing co-ordination between onshore and offshore teams
• managing the partners and working groups engaged in project work
• working closely with other vendors and partners to ensure the project meets business needs
additional information: red alpha is an equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by law.

synechron is looking for a data architect for our insurance client in basking ridge nj. below are the requirement details: data architect. location: basking ridge nj. job details: prior experience in the insurance/reinsurance domain is a big plus. able to analyze system requirements and implement migration methods for existing data. hands-on experience with sql is required (a small store-and-retrieve sketch follows these job details). familiarity with any reporting and data visualization tools is a plus. develop database solutions to store and retrieve company information. able to analyze structural requirements for new software and applications. design conceptual and logical data models and flowcharts. able to define optimization techniques. proven work experience as a data architect, data scientist, data analyst, or similar role. in-depth understanding of database structure principles.
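as referenced above, a small store-and-retrieve sketch; sqlite stands in for the client's database, and the schema is illustrative (the company/treaty tables are assumptions in the spirit of the insurance domain, not the client's actual model):

```python
# company_store_sketch.py - store and retrieve company reference data
# (sqlite stands in for the client's rdbms; the schema is illustrative)
import sqlite3

conn = sqlite3.connect("insurance.db")
conn.executescript("""
    create table if not exists company (
        company_id integer primary key,
        name       text not null,
        domicile   text
    );
    create table if not exists treaty (
        treaty_id  integer primary key,
        company_id integer not null references company(company_id),
        line       text,
        limit_usd  real
    );
""")
conn.execute("insert into company (name, domicile) values (?, ?)",
             ("example re", "nj"))  # hypothetical row
conn.commit()

# retrieve: treaties per company, a simple join over the logical model
for row in conn.execute("""
        select c.name, count(t.treaty_id)
        from company c left join treaty t using (company_id)
        group by c.name"""):
    print(row)
conn.close()
```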
about synechron: synechron, one of the fastest-growing digital business consulting & technology services providers, is a $500 million firm based in new york. since inception in 2001, synechron has been on a steep growth trajectory, with 8,000+ professionals operating in 18 countries across the world, with presence across the usa, canada, uk, europe, asia, and the middle east. please do visit our website: http://www.synechron.com

senior director, big data. direct hire, permanent ($170,000 - $220,000 salary, or more based on experience). coral springs fl. do you have the ability to grow and scale a new data-driven team? are you a lover of data, interested in predictive modeling and machine learning? if so, this opportunity is for you! here are the details: reporting directly to the ceo, the senior director, big data will lead a traditional it department as they move toward a more cloud-based, data-driven organization. day to day responsibilities may include: guide and direct a traditional it team as they move toward a big data, cloud-based environment. create a vision for big data adoption and align project requirements with the vision. partner with the ceo and various department heads to develop a data strategy. promote implementation of a data lake, as well as data-driven thinking, throughout the enterprise. provide thought leadership, assess new technologies, make recommendations, and make hiring decisions based on business needs. through training and tool sharing, help analysts meet the company's need for predictive modeling and machine learning. what we are looking for: bachelor's degree or higher in computer science, computer engineering, or similar. expert-level experience with data architecture. extensive experience with cloud computing platforms such as aws. extensive experience leading a data-focused team as a manager or director. expert-level experience with big data technologies (hadoop, spark, mongodb, etc.). experience building data lakes. strong api architecture experience. strong experience with data analytics and/or data science technologies and concepts, such as r, python, and predictive modeling. what we can offer you: in addition to competitive salaries and growth opportunities, employees can take advantage of the following benefits programs: medical, dental, and vision care coverage; pto, vacation, and holidays; 401k plus company match. about strategic it staffing: with over 20 years of staffing experience in the technical information field, strategic it staffing knows the industry well, from the most rewarding jobs available to the hottest career choices to make. in addition, we are owned and operated by industry experts who have worked in the information arena for over two decades. when you choose to work through strategic it staffing, you choose a team approach to employment. we match your career choice and skills with an employer who needs your knowledge and expertise. we make sure you are in control of your career. strategic it staffing is an equal opportunity employer. all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status.

job family: risk management (finance product). job description summary: transamerica asset liability management is hiring a "director of data and reporting" to report to the senior director of risk management - asset liability management (alm) & hedging. the director will lead the data and reporting analytics group at enterprise risk systems, a business-side quant analytics and software development group.
in this role the director is responsible for designing and implementing data models, tools, processes, and governance for all analytic systems used at alm. the role requires close coordination with external organizations such as transamerica technology, data & analytics, and data governance. the director plays a key role in ensuring that controls and data integrity match the highest regulatory and internal control expectations, while at the same time fostering innovation with new technologies, enabling data science, and contributing to turning transamerica into a data-driven organization. job description. responsibilities:
+ lead the cfo alm enterprise risk systems data and reporting analytics team
+ conceptualize, design, and assist in the implementation of new and existing systems
+ master data management, metadata, data quality, data governance, and data architecture
+ continually increase business acumen and awareness of technology best practices to help the team deliver business solutions
+ support process framework, governance, standards, audit controls, architecture, and financial management
+ coordinate vendor engagement and internal it engagement
+ communicate effectively with project sponsors and stakeholders on project status, progress, risks, issues, and expected outcomes
+ continually work towards establishing a single reporting repository to reduce data duplication and quality issues
+ establish self-service analytic capabilities for business functions
+ devise, document, and implement conceptual and quantitative models to solve business problems
+ gather pertinent information and data sources across disciplines to formulate solutions
+ architect, design, and implement software using an agile approach
+ coordinate with the it team the adoption of systems in the production environment under sdlc guidelines
+ produce and present departmental level analysis to mid-level management
+ develop and maintain the subject matter expertise required to advise business management
+ this role may include work in alm modeling, research, trading, and hedging strategies
+ the role requires design and implementation of software
required qualifications:
+ ba/bs in mathematics, actuarial science, finance, business, or a related field, with 10 years of relevant work experience
+ or master's degree in mathematics, actuarial science, finance, or a related field, and at least 8 years of experience
preferred qualifications:
+ communication skills to convey complex information to the business, both verbally and in writing, at an appropriate level of detail for each audience
+ previous experience managing enterprise-level projects
+ developed and delivered data-driven solutions to support business needs and analytics; provided recommendations and solutions for improving data accessibility, as driven by the business need for knowledge and decisions
+ deep expertise in relational databases and sql
+ strong understanding of data warehousing, analytics, reporting, and best practices
+ assisted a team in the ongoing creation and maintenance of documentation related to data processes and business rules; used a variety of data visualization/presentation tools to support self-service analytics or to aggregate data for management presentations
+ make tactical data-driven decisions, leveraging experience and with consideration of competing priorities; consult with end-users and respond with solutions
+ able to work within a fast-paced environment with quickly changing priorities
+ able to exercise judgment as it relates to business decisions and their effects on stakeholders
+ strong time management skills to manage multiple priorities under time constraints
+ highly organized and detail oriented, with the ability to maintain a high level of accuracy
+ strong presentation and communication skills, both written and verbal
+ strong experience in leading teams responsible for data management and/or resolving data issues
+ demonstrated business acumen and the ability to apply technology solutions to solve business problems
+ experience in the financial services industry
+ knowledge of equity, fixed income, credit, and derivative instruments
+ experience with cloud and big data technologies
+ experience developing on linux and with scripting languages
+ experience with trading systems
+ experience with security master, timeseries, and eod marks data models
behavioral & leadership competencies:
+ significant experience and a demonstrated track record with sophisticated companies
this role is a management level position. our culture: at transamerica we promote a future fit mindset. what is a future fit mindset?
+ acting as one fosters an environment of positive collaboration
+ accountability allows us to own the problem as well as the solution
+ agility inspires new ideas, innovation, and challenges the status quo
+ customer centricity encourages an above and beyond approach to our customer
why work for us? total rewards at transamerica: it's more than a paycheck. our comprehensive total rewards package is designed to help support you in many ways, throughout all stages of your life and career. we provide a competitive, market-driven program that encompasses base compensation, bonus potential, retirement, health and wellness benefits, learning and development opportunities, plus great employee perks, all designed with you in mind... to help you live your best life, grow personally and professionally, and feel valued for the work you do. learn more about our total rewards package. equal opportunity employer: transamerica life insurance company is an equal employment opportunity employer and does not discriminate against any applicant or employee because of age, religion, sex, gender identity, genetic information, race, color, national origin, pregnancy, sexual orientation, marital status, participation in the uniformed services (e.g. u.s. armed forces, national guard), physical or mental disability, or any other status protected by federal, state, or local equal employment opportunities laws. aegon usa realty advisors llc is an equal employment opportunity/affirmative action employer and does not discriminate against any applicant or employee on the same grounds. applicants with physical or mental disabilities may be entitled to a form of reasonable accommodation under the americans with disabilities act and certain state and local laws; a reasonable accommodation is a change in the way things are normally done which will ensure equal employment opportunity without imposing undue hardship on the transamerica companies. please contact applicantsupport@transamerica.com if you are a job seeker with a disability, or are assisting someone with a disability, and require assistance to apply for one of our jobs. ontario applicants: our company is committed to providing accessibility to those with disabilities in a manner that is consistent with the principles of
independence, dignity, integration, and equality of opportunity, and that is in compliance with the accessibility for ontarians with disabilities act, 2005 ("aoda"). please contact applicantsupport@transamerica.com if you are a job seeker with a disability, or are assisting someone with a disability, and require assistance to apply for one of our jobs. technical assistance: if you experience technical problems during the application process, please email applicantsupport@transamerica.com. at transamerica, hard work, innovative thinking, and personal accountability are qualities that we honor and reward. we understand the potential that is unleashed by leveraging the talents of a diverse workforce. we embrace an environment where employees enjoy a balance between their careers, families, communities, and personal interests. ultimately, we appreciate the uniqueness of a company where talented professionals work collaboratively in a positive environment focused on helping customers secure their long-term financial futures. transamerica is a part of aegon, an international life insurance, pension, and asset management company. the aegon companies employ approximately 28,000 people and have a strong presence in more than 20 countries across the globe. for more information, visit www.transamerica.com

attn: west coast based database architect. full time remote position with an inc 500 saas healthcare software company. we are one of the fastest growing companies in america, recognized 3x by inc magazine. vitalware is a fully funded saas company in the healthcare space. we are fast, flexible, and give you the opportunity to branch out beyond a single task or skill set. you will touch different projects and be able to learn and do things that you can't in a larger or more bureaucratic company. if you have 5+ years database architect experience, have automated the loading of excel files into mssql (a sketch of such a load follows below), have worked with nosql like mongodb or elasticsearch, and have worked with version control for databases using git or other version control software, you go to the top of the list! we want to talk to you and see if it is a fit for both of us. p.s. you can work anywhere, and our retention rate is out of this world, which means people love us and stay!
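a minimal sketch of the excel-to-database load mentioned above, assuming pandas, sqlalchemy, and openpyxl; sqlite keeps the sketch locally runnable, with an mssql connection url shown only as a comment:

```python
# excel_load_sketch.py - automate loading excel workbooks into a database
# (assumes pandas, sqlalchemy, and openpyxl; folder and table names are
# illustrative)
import glob
import pandas as pd
from sqlalchemy import create_engine

# for mssql, something like:
# create_engine("mssql+pyodbc://user:pass@server/db?driver=ODBC+Driver+17+for+SQL+Server")
engine = create_engine("sqlite:///staging.db")

for path in glob.glob("incoming/*.xlsx"):  # hypothetical drop folder
    # sheet_name=None returns every sheet as its own dataframe
    for sheet, frame in pd.read_excel(path, sheet_name=None).items():
        frame.columns = [c.strip().lower().replace(" ", "_") for c in frame.columns]
        frame.to_sql(sheet.lower(), engine, if_exists="append", index=False)
        print(f"loaded {len(frame)} rows from {path}:{sheet}")
```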
let's talk. requirements, qualifications and experience: bs, ms, or phd in computer science or a related technical discipline preferred. 5+ years relevant experience. proven work experience as a data architect, data scientist, data analyst, or similar role. in-depth understanding of database structure principles. experience gathering and analyzing system requirements. knowledge of data mining and segmentation techniques. expertise in sql. experience with nosql (mongodb, elasticsearch, etc.). proficiency in ms excel. familiarity with data visualization tools. proven analytical skills. problem-solving attitude. skill and ability requirements: enthusiastic about sharing knowledge and experience; passionate about learning; excellent communication, written and verbal; strong organizational, presentation, interpersonal, and consultative skills a must; ability to manage multiple projects/tasks simultaneously; good judgment and decision-making skills; maintains a positive and results-oriented attitude; must be well organized, accurate, and attentive to detail. note: this job description is not intended to be all-inclusive; employees may perform other related duties as negotiated to meet the ongoing needs of the organization. benefits: vitalware offers a full range of benefits to regular full-time employees that support employees and eligible family members, including domestic partners and their children. these benefits include: medical, prescription drug, dental, and vision coverage; 401(k) savings plan; long term disability; paid time-off and holiday pay.

as the nation's leading bottled water company, nestlé waters north america is dedicated to providing customers with healthy hydration options. alongside that, we're also committed to developing our people, enabling them to make the most of the many elements that help them to succeed. nestlé waters consists of five business units: corporate, commercial, supply chain, technical & production, and readyrefresh by nestlé. whichever one of these areas you choose to join, you'll find yourself collaborating with a highly talented team on work that's challenging, engaging, and incredibly rewarding. you'll be an essential element of our success: trusted, empowered, and supported to make a lasting impact on the very future of our business. it's a chance to use your knowledge, skills, and experience to shine brightly and achieve your ambitions, all while delivering healthy hydration to millions of customers. readyrefresh by nestlé is one of the most visible parts of the nestlé waters business, delivering healthy hydration to customers where they need it most. it's another example of how we are committed to helping people maintain a healthy lifestyle. you've no doubt seen our trucks on the road, on their way to bring our water and tea products to thirsty consumers. by joining this fast-growing area of our organization, you'll have the opportunity to share in our mission with a real sense of ownership and the freedom to succeed in your role. it's a chance to apply your skills and experience to work that's as challenging as it is rewarding. whether collaborating as a team to deliver superior customer service or making a lasting impact with your individual accomplishments, you'll be an essential and valuable element of our success. we'll make sure you receive the support, benefits, and development you need to build the perfect career. as readyrefresh works to transform and modernize our unique business, you will have the exciting opportunity to help design and develop the systems that we use to deploy data science solutions at scale.
you will be joining our growing data and insights team to help drive business decisions. we are not just looking for someone to build etl pipelines or a static dataset, but rather someone who will help us build the future of data and insights within a mature business. the ideal candidate will know how to understand our business model with the lens of data engineering, and work to enable the business utilizing data. job responsibilities: data strategy: own the creation and evolution of a data strategy and architecture. this position is entrepreneurial in nature; you will help our team lay the foundations for, and bring thought leadership to, our data infrastructure. build the systems that we use to deploy data science solutions. beyond building etl pipelines or a static dataset, this role will understand our business model with the lens of data engineering and positively drive results through these systems and data. scope data opportunities, including quality, optimization, testing, and tooling, to improve business results. build large-scale batch and real-time data pipelines to scale data science models and business intelligence. business acumen: exhaustive ability to validate and crosscheck metrics (by understanding the business); when these metrics don't make sense, we need to create and drive a process to define new ways of looking at this data through cross functional debate. work in cross-functional and agile teams to continuously experiment, iterate, and deliver on new objectives. data execution: organize business needs into logical models and ensure data structures are designed for flexibility, to support scalability of business solutions. create data end-points that become a new source of intelligence for users. #li-ea1. qualifications, requirements, and minimum education level: bachelor's degree with an emphasis on computer or information science, applied mathematics, physics, or another technical focus. 4+ years in practical database design. minimal to 15% travel required. excellent in data representation, computer architecture, and organization. strong coding skills using python and sql needed, including standard data libraries such as pandas, sqlalchemy, etc. experience with big data tools such as hadoop, hive, spark, etc. olap development experience (data warehouse concepts, cube design, fact/dimension structure, data mart development, etc.) - a small fact/dimension sketch follows this posting. working knowledge of the microsoft bi suite, including ssas, ssis & mdx/dax. familiar with the following or equivalent: microsoft power bi, microsoft azure, visual studio with ssdt (sql server data tools), ssms. is able to research, experiment with, and utilize leading big data methodologies such as hadoop, spark, hdinsight, azure data lake analytics, and azure data warehouse. is able to architect, implement, and test data processing pipelines and data mining/data science algorithms on microsoft azure.
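as referenced in the qualifications above, a small fact/dimension sketch in pandas; the tables and figures are invented for illustration (a real cube would live in ssas or power bi):

```python
# cube_sketch.py - fact/dimension join and a small "cube" with pandas
# (illustrative data; brand names come from the posting, figures are made up)
import pandas as pd

dim_product = pd.DataFrame({
    "product_id": [1, 2],
    "brand": ["poland spring", "perrier"],
})
fact_delivery = pd.DataFrame({
    "product_id": [1, 1, 2, 2],
    "region":     ["east", "west", "east", "west"],
    "units":      [120, 80, 45, 60],
})

# star-schema style join, then aggregate along two dimensions with totals
cube = (
    fact_delivery.merge(dim_product, on="product_id")
    .pivot_table(index="brand", columns="region",
                 values="units", aggfunc="sum", margins=True)
)
print(cube)
```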
nestlé waters north america is the nation's leading bottled water company. our brands include poland spring, arrowhead, ozarka, deer park, zephyrhills, ice mountain, nestlé pure life, s.pellegrino, perrier, acqua panna, and nestea. we're passionate about creating shared value for society in all kinds of ways: from providing careers and benefits to communities where we operate, to environmental stewardship, most notably responsible water management, lightweight packaging, and advancing recycling in america. as a valuable part of our team, you'll receive a competitive total rewards package, something that will provide you with the support you need to thrive both inside and outside of work. it's not just the work that you'll find fulfilling here, though. as you build a career with us, you'll receive exactly the kind of benefits you'd expect from a leading name in healthy hydration. the only question is, what elements will help you succeed at nestlé waters? the nestlé companies are equal employment and affirmative action employers, looking for diversity in qualified candidates for employment.

description: come join intuit as a lead data architect for our proconnect group (pcg) analytics team. we are looking for a strong communicator and a problem solver who has deep experience in various big data technologies to build out the target state of pcg's data warehouse. our vision is two-fold: first, to organize and operationalize all of pcg's data sets in an intuitive, analyst friendly way; second, to simplify data access for analysts, data scientists, and apis to consume our data. can you unleash our analyst horse power? if so, this is the perfect opportunity to join a talented group in a start-up like environment working with cutting-edge technologies. understanding business needs and translating data requirements will be critical to this role. this is a very hands-on player coach role, and you must be willing to dig in and code at least 50% of the time. pcg is seeking to make a step change by not just organizing data for analysts, but also opening up data for all products, systems, and apis. responsibilities:
+ lead the migration of intuit's consumer data platform to aws
+ build a central data warehouse that connects various touch points from sales, marketing, product, care, and finance (a minimal dimensional-model sketch follows these responsibilities)
+ strong conflict resolution skills, to be able to drive clarity in requirements and move the project forward
+ partner with business analysts and engineering to enable decision support
+ develop etl processes to enable data experts to analyze user behaviors and kpi performance
+ ensure data collection is optimized to provide crystal-clear visibility into the impact and value of new initiatives and product releases
+ be accountable for developing a comprehensive enterprise data architecture, including data lake, data appliance, data warehouse, data movement, and decision science platform components
+ strong understanding of data lake approaches, industry standards, and industry best practices; perform hands-on development, coaching, and leadership through all project phases
+ analyze requirements & business problems in order to provide data solutions/structures that address them
+ provide guidance on suitable options, designing and creating data pipelines for the analytical solutions, from data lake and data warehouses to specific micro services
+ design etl jobs based on jointly defined requirements along the data pipeline
+ design and build efficient pipelines using open source tools, packages, and unix shell scripts
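as referenced in the responsibilities above, a minimal dimensional-model sketch; sqlite stands in for redshift, and all table and column names are illustrative:

```python
# warehouse_sketch.py - a minimal dimensional model for the kind of central
# warehouse described above (sqlite stands in for redshift; names are
# illustrative, not intuit's actual schema)
import sqlite3

ddl = """
create table if not exists dim_customer (
    customer_key integer primary key,
    segment      text
);
create table if not exists dim_date (
    date_key integer primary key,   -- yyyymmdd surrogate key
    month    text
);
create table if not exists fact_touchpoint (
    customer_key integer references dim_customer(customer_key),
    date_key     integer references dim_date(date_key),
    channel      text,              -- sales, marketing, product, care, finance
    revenue_usd  real
);
"""
with sqlite3.connect("pcg_dw.db") as conn:
    conn.executescript(ddl)
    # a typical rollup an analyst or api could consume
    # (empty until the fact table is loaded; shown for shape only)
    for row in conn.execute("""
            select d.month, f.channel, sum(f.revenue_usd)
            from fact_touchpoint f join dim_date d using (date_key)
            group by d.month, f.channel"""):
        print(row)
```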
qualifications:
+ 10+ years of experience in enterprise data architecture
+ 8+ years of experience in conceptual, logical, and physical data modeling
+ experience in modeling solutions in aws – redshift (hive knowledge preferred)
+ 10+ years of proven expertise in relational and dimensional data modeling
+ 5+ years of experience with the erwin data modeling tool
+ strong understanding of cloud architecture, specifically aws as it relates to data processing (i.e. ec2, s3, redshift, etc.)
+ able to define & maintain bi/data warehouse methodologies, standards, and industry best practices
+ experience leading and architecting bu-wide initiatives for a large enterprise, specifically system integration, data warehouse builds, data mart builds, data lakes, etc.
+ able to confidently express the benefits and constraints of technology solutions to technology partners, stakeholders, team members, and senior levels of management
+ proactive and inquisitive learner; seeks out and capitalizes on opportunities for change that enhance the business, rather than reacting to circumstances
+ demonstrated ability to think end-to-end, manage long-term projects, manage multiple tasks simultaneously, and deliver on outcomes/achieve results
+ bias for action; high energy, "can do" style that thrives in a fast-paced environment
key supporting competencies:
+ creates collaborative relationships
+ command skills, decision quality
+ dealing with ambiguity
+ technical leadership and strategic direction for data and analytics teams
+ ability to accomplish results through others, particularly by establishing relationships and being an effective team member or business partner
key technology skills:
+ enterprise data architecture
+ aws
+ redshift
+ dimensional modeling
+ sql tuning
+ etl architecture
+ data migration
+ performance tuning
eoe aa m/f/vet/disability.

senior data manager needed for a long-term contract opportunity with yoh scientific's client, located in berkeley heights nj. key responsibilities:
• perform a quality check (qc) of all types of edc database discrepancies/queries generated for a study, to identify and report any data trends, database programming errors, possible training issues, etc. (a small qc sketch follows this posting)
• run/generate reports from the edc database for qc of queries
• review and/or reference study protocols for understanding of procedures to be collected for the trial
• review and/or reference data management documents, such as the data management plan (dmp) and ecrf completion guidelines, when applicable, to understand the established process and guidelines for case report forms during review of the database queries
• reference and follow the applicable sops for the process
• demonstrate ability to analyze data and share trends/observations with team members around results and relevant metrics
• review/spot check data review listings/reports for any data trends or findings for a study
• may provide cro oversight of the data management functional activities and monitor progress and deliverables for a study
• may lead dm study start-up and maintenance activities for a study
qualifications:
• ba/bs in a relevant scientific discipline; minimum five (5) years of experience as a lead data manager in a pharmaceutical/cro setting
• shows solid interpersonal skills
• ability to work on multiple studies/projects and manage workload
• working knowledge of edc databases, preferably medidata rave; knowledge of j-review or another reporting tool, and a basic understanding of sdtm and cdisc
• strong written and oral communication skills; ability to work in a team environment with medical personnel, clinical monitors, statisticians, programmers, and medical writers
• ability to work in a study team environment
• intermediate knowledge of oncology/hematology therapeutic areas
• knowledge of fda/ich guidelines and industry standard practices regarding data management
opportunity is calling. apply now!
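a small qc sketch of the query-trend checks described above, assuming pandas and an illustrative export file with status, site, form, and date columns (real exports would come from an edc system such as medidata rave):

```python
# edc_qc_sketch.py - quality-check report over an export of edc queries
# (column names and the export file are illustrative assumptions)
import pandas as pd

queries = pd.read_csv("edc_query_export.csv")  # hypothetical export

# trend check: open queries by site and form, to surface training issues
open_by_site = (
    queries[queries["status"] == "open"]
    .groupby(["site_id", "form_name"])
    .size()
    .sort_values(ascending=False)
)
print(open_by_site.head(10))

# aging check: queries open longer than 30 days
queries["opened"] = pd.to_datetime(queries["opened"])
stale = queries[(queries["status"] == "open") &
                (pd.Timestamp.today() - queries["opened"]).dt.days.gt(30)]
print(f"{len(stale)} queries open > 30 days")
```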
senior data engineer. requisition id: 42109. business unit: corporate. location: portland, or, us 97209. logistics done differently. you are always looking for a better, more efficient way to improve productivity. at xpo logistics, we are looking for people who can effectively incorporate strategies to help people work smarter. as the senior data engineer, you will be responsible for defining, designing, and implementing data pipelines, data engineering solutions, and analytics components to support strategic initiatives and ongoing business processes. you will develop solutions that leverage, integrate with, and expand upon the company's business intelligence (bi) platform technologies and patterns. if you're ready to further your career—to go bigger and better—we have an opportunity for you to grow with xpo. pay, benefits and more: we are eager to attract the best, so we offer competitive compensation and a generous benefits package, including full health insurance (medical, dental, and vision), 401(k), life insurance, disability, and the opportunity to participate in a company incentive plan. what you'll do on a typical day:
+ deliver and drive efforts in developing bi data solutions focused on data engineering and analytics, using python, machine learning, rdbms, r, sql, data modeling, java, unix scripting, aws, microstrategy, power bi, and mpp databases (see the machine-learning sketch after these lists)
+ troubleshoot technical issues in existing processes and current development work, solicit assistance from other roles and groups, and drive resolution to ensure the integrity of platform quality and project timelines
+ understand and improve shared standard patterns, templates, and artifacts for bi and data warehouse platform architecture, data science development approaches, data models, and new technology adoption and rollout
+ collaborate with it and business teams to identify and document use cases, performance and capability requirements, and criteria for successful solution delivery
+ mentor other team members on technical skills, best practices, problem-solving approaches, and standard patterns used at xpo logistics
+ proactively generalize and share technical approaches and best practices among other developers, and simplify and communicate completed work to broader audiences across the company
+ help support data consumers to ensure they have reliable access to trusted data, including periodic responsibility for 24/7 on-call production support
what you need to succeed at xpo: at a minimum, you'll need:
+ bachelor's degree in computer science, mis, engineering, or a related field
+ 5 years of experience in data engineering and analytics, delivering business solutions that rely on business intelligence, data warehousing, data science, or similar domains
it'd be great if you also have:
+ master's degree in computer science, mis, engineering, or a related field
+ data science specialization from coursera, udacity, data camp, etc.
+ r and python certifications; certified business intelligence professional from tdwi; data management professional from dama; certifications in data modeling and data engineering
+ 6 years of experience in data engineering and analytics, and in delivering solutions in at least two of the following: data science, big data, r or python, and machine learning
+ experience in handling 1,000+ lines of complex sql code to deliver complex data-based solutions in both development and data analysis work
+ knowledge of cloud solutions (preferably the amazon suite of products); data analytics and visualization using r and python (numpy, pandas, scipy, scikit-learn, tensorflow, and keras)
+ familiarity with and working knowledge of dmbok (data management body of knowledge) core concepts (general data management, governance, architecture, development, security, quality, metadata management, mdm, etc.)
+ ability to build logical data models, physical data models, and normalization techniques with 3nf (using tools like erwin)
+ ability to work with mpp databases like netezza, redshift, or teradata, and data pipelines using informatica (power center and power exchange) and python data glue
+ ability to analyze, summarize, and characterize large or small data sets with varying degrees of fidelity or quality, and identify and explain any insights or patterns within them
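as referenced in the lists above, a small machine-learning sketch with scikit-learn; the data is synthetic and the features are illustrative:

```python
# ml_sketch.py - a minimal scikit-learn workflow of the kind the role
# supports (synthetic data; feature meanings are assumptions)
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 4))  # e.g. lane, weight, distance, day-of-week
y = X @ np.array([3.0, -1.0, 0.5, 0.0]) + rng.normal(scale=0.2, size=1000)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X_train, y_train)
print("mae:", mean_absolute_error(y_test, model.predict(X_test)))
```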
be part of something big. #li-ed1. xpo provides cutting-edge supply chain solutions to the world's most successful companies, including disney, pepsi, l'oréal, toyota, and many others. we're the fastest-growing transportation company on the fortune 500 list, and we're just getting started. we are proud to be an equal opportunity/affirmative action employer. qualified applicants will receive consideration for employment without regard to race, sex, disability, veteran, or other protected status. the above statements are intended to describe the general nature and level of work being performed by people assigned to this classification; they are not intended to be construed as an exhaustive list of all responsibilities, duties, and skills required of personnel so classified. all employees may be required to perform duties outside of their normal responsibilities from time to time, as needed. nearest major market: portland oregon. job number: r0014662.

booz allen hamilton has been at the forefront of strategy and technology for more than 100 years. today the firm provides management and technology consulting and engineering services to leading fortune 500 corporations, governments, and not-for-profits across the globe. booz allen partners with public and private sector clients to solve their most difficult challenges through a combination of consulting, analytics, mission operations, technology, systems delivery, cybersecurity, engineering, and innovation expertise. data architect, senior. key role: integrate, manipulate, and manage vast amounts of data into the next generation of big data analytic solutions for our clients. combine engineering expertise with innovation to deliver robust solutions that serve our clients and stand apart from our competitors. interact with a multi-disciplinary team of analysts, data scientists, developers, and users to understand the data requirements, and develop a robust data processing pipeline that will ingest, manipulate, normalize, and expose potentially billions of records per day to support advanced analytics. collaborate with and contribute to open source, and ensure quality delivery of software through thorough testing and reviews. architect, build, and launch new data models that provide intuitive analytics to our customers, and design, build, and launch extremely efficient and reliable data pipelines to move data in both large and small amounts to our data platform. design and develop new systems and tools to enable folks to consume and comprehend data faster, and identify new technologies to be injected into the platform to support advanced data integration and analysis.
basic qualifications:
- 5+ years of experience with dimensional data modeling and schema design in data warehouses
- 3+ years of experience with software design, implementation, and test
- 2+ years of experience with custom or structured etl design, implementation, and maintenance
- experience in working with either a map reduce or similar system on any size or scale, including storage components such as accumulo, hbase, or hive
- experience with batch and streaming frameworks, including storm, nifi, apex, or flink
- experience with search technologies, including solr and elasticsearch
- knowledge of restful services design, development, and testing, including developing service apis for external consumption
- ability to quickly learn technical concepts and communicate with multiple functional groups
- ts sci clearance with a polygraph
- bs degree
additional qualifications:
- experience with multiple data modeling concepts, including xml or json
- experience with rdbms data stores, including oracle or mysql
- experience with large-scale distributed systems design and development for scaling, performance, and scheduling
- experience with machine learning and deep learning concepts and algorithms
- experience with devops methods and tools, including jenkins, git, svn, docker, or vagrant
- knowledge of at least one scripting language, including python, node, ruby, or bash
- knowledge of system architecture, including process, memory, storage, and networking management, preferred
- possession of excellent analytical and problem-solving skills
- ms degree in cs or a related field
- security+ or cissp certification preferred
clearance: applicants selected will be subject to a security investigation and may need to meet eligibility requirements for access to classified information; a ts sci clearance with polygraph is required. integrating a full range of consulting capabilities, booz allen is the one firm that helps clients solve their toughest problems, by their side to help them achieve their missions. booz allen is committed to delivering results that endure. we are proud of our diverse environment. eoe m/f/disability/vet. department: design engineering.

client in the healthcare space is expanding a department dedicated to data analytics. consultant must be senior level, with 3-4 years of experience in the field. consultant will work with a group of security researchers and data scientists to analyze unique, context-rich endpoint and network traffic data collected through the client's cloud platform. consultant will be tasked to apply advanced machine learning algorithms to classify iot devices and detect anomalous device behavior (a minimal anomaly-detection sketch follows below), and will be responsible for the overall quality of forescout's iot security intelligence content. for inquiries please reach out to austin.mcquay@rht.com
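a minimal anomaly-detection sketch in the spirit of the engagement above, using scikit-learn's isolationforest on synthetic per-device traffic features (the features and contamination rate are assumptions):

```python
# iot_anomaly_sketch.py - flag anomalous device behavior from traffic
# features (synthetic data; feature names are illustrative)
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(1)
# per-device features: packets/min, distinct ports, bytes out
normal = rng.normal(loc=[100, 3, 5_000], scale=[10, 1, 500], size=(500, 3))
odd = rng.normal(loc=[400, 40, 50_000], scale=[50, 5, 5_000], size=(5, 3))
traffic = np.vstack([normal, odd])

clf = IsolationForest(contamination=0.01, random_state=1).fit(traffic)
flags = clf.predict(traffic)  # -1 marks devices scored as anomalous
print("anomalous rows:", np.where(flags == -1)[0])
```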
**technology doesn't change the world. people do.** as a technology staffing firm, we can't think of a more fitting mantra. we're extreme believers in technology and the incredible things it can do, but we know that behind every smart piece of software, every powerful processor, and every brilliant line of code is an even more brilliant person. **leader among it staffing agencies** the intersection of technology and people — it's where we live. backed by more than 65 years of experience, robert half technology is a leader among it staffing agencies. whether you're looking to hire experienced technology talent or find the best technology jobs, we are your it expert to call. we understand not only the art of matching people, but also the science of technology. we use a proprietary matching tool that helps our staffing professionals connect just the right person to just the right job, and our network of industry connections and strategic partners remains unmatched. apply for this job now or contact our branch office at 888.674.2094 to learn more about this position. all applicants applying for u.s. job openings must be authorized to work in the united states. robert half will consider qualified applicants with criminal histories in a manner consistent with the requirements of the san francisco fair chance ordinance. © 2018 robert half technology, an equal opportunity employer m/f/disability/veterans. by clicking 'apply now' you are agreeing to robert half terms of use. *req id:* 00410-0010338662 *functional role:* network engineer *country:* usa *state:* il *city:* schaumburg *postal code:* 60195-5161 *compensation:* doe *requirements:* python, ruby on rails, internet of things (iot), machine learning, data mining, information/network security.

**job requisition number:** 65791. bloomberg enterprise solutions (ed) is experiencing a prolonged period of unprecedented growth. our data and technology solutions focus upon the acquisition, organisation, and distribution of data around financial firms. product management team: in this role you will be working with the bloomberg enterprise data product management team. we are constantly improving our products to make them more feature rich and application ready, whilst reducing the total cost of ownership. product managers have a high level of business communication and technology skills, so that they can create working prototypes, specify engineering tasks, and work effectively with sales and clients. the role: we wish to enhance our data and technology solutions by hiring an experienced financial data architect who is familiar with specifying and using large structured data models and data sets to solve complex business demands and meet regulatory requirements. in this exciting role you will design, specify, and create prototypes of large and varied data models and data sets, both financial and non-financial, using a wide range of tools, methods, and platforms. we expect you to evaluate areas of interest, understand what is possible and commercially viable, recognise which models and concepts should be applied, and how the data resources should be acquired. an open, creative approach is critical to your success. you will design data for diverse addressable markets, explore and work with a wide range of data, and apply existing methods or develop new methods. you will also engage in data analysis in a practical way, convince business leaders that your results are worth investing in, and educate other analysts and business team members. most critically, you will deliver the output of your data design to business users (both non-technical and technical). you will be a key person in shaping the product offering and the customer experience. what's in it for you:
+ an influential business role within a fast growing area
+ broad scope to design and specify commercially attractive products
+ the opportunity to grow your business and technology skills
we'll trust you to:
+ understand addressable markets
+ evaluate what is possible and viable
+ take responsibility for the data architecture of multiple data sets
+ work with engineering, sales, and clients
you need to have:
+ bachelor or masters degree in mathematics, computer science, or a related discipline
+ experience with large, diverse data sets and the tools to handle them
+ knowledge of large structured data models such as fix and isda fpml (a toy fix-parsing sketch follows this list)
+ experience in working with global teams across sales, technology, and operations
+ the ability to address multiple priorities in an extremely fast-paced environment
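as referenced in the list above, a toy fix-parsing sketch; the tag meanings follow standard fix 4.4, but a production system would use a full fix engine and data dictionary:

```python
# fix_parse_sketch.py - decode a fix tag=value message into a dict
# (a toy parser for illustration only)
SOH = "\x01"  # fix field delimiter

def parse_fix(raw: str) -> dict:
    # each field is "tag=value"; split once so values may contain '='
    return dict(field.split("=", 1) for field in raw.strip(SOH).split(SOH))

# 35=D is a new order single; tags shown are standard fix 4.4 fields
# (8=BeginString, 55=Symbol, 54=Side where 1 means buy, 38=OrderQty)
sample = SOH.join(["8=FIX.4.4", "35=D", "55=IBM", "54=1", "38=100"]) + SOH
msg = parse_fix(sample)
print(msg["55"], "side:", msg["54"], "qty:", msg["38"])
```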
We'd love to see:
+ Knowledge of market data and reference data
+ Programming skills in languages such as R, Python, and Scala
+ Proficiency in data science tools and methods
+ A track record of using agile and kanban methodologies
+ A working knowledge of financial markets, securities, and derivatives in all major liquid asset classes

If this sounds like you: Apply if you think we're a good match! We'll get in touch to let you know what the next steps are, but in the meantime feel free to have a look at this: https://www.bloomberg.com/professional/solution/data-and-content. We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.

Software Developer - Big Data. Riversand Technologies is a master data management (MDM) visionary and a product information management (PIM) leader. We are a team of passionate people who are rethinking the way MDM and PIM work. We have recently raised $35 million in Series A funding; over the next two years we will be on a trajectory for accelerated product innovation and growth. If you are a technical lead with prior experience working on diverse projects, looking to expand your software development knowledge and advance your career in data management and analytics, then now is the right time to join Riversand. The applications you will be developing will power enterprises worldwide in a variety of industries, including retail, manufacturing, distribution, energy, healthcare, and food services. As a Software Developer - Big Data, you will provide superior technical implementation expertise on your project and improve the experience our clients have with Riversand solutions.

Here are some of your responsibilities; we would like to know what else you can add to this:
- You need to have a passion for technology and an eye for detail, and take pride in mastering the use cases of your module in the product
- You are delivery focused and will deliver as estimated, with the quality expected
- You will build reusable code and libraries and be willing to refactor continuously
- You will develop, review, and optimize the applications for performance and scalability
- You will develop software in collaboration with other developers and product management

If what you read so far excites you about joining us, then we would like you to be already equipped with the following qualifications:
- Bachelor's/master's degree in computer science, engineering, or a related field (or equivalent experience)
- Worked on one or more technologies for data management use cases [not analytics], like Elasticsearch, Apache Solr, Apache Kafka, Apache Storm, Apache HBase, Apache Hadoop, Apache Spark, and Apache Cassandra
- Minimum of 2 years in Java development
- Strong OO programming and OO design concepts knowledge
- Experience developing multi-threaded applications
- Strong unit/integration testing experience
- Strong analytical and logical skills, including troubleshooting
- Prior experience developing and shipping a shrink-wrap or SaaS product a plus
- Open source contributions a plus
- Knowledge of AWS, Azure, and Docker a plus

What's in it for you?
- We foster a collaborative work environment; you will enjoy learning and sharing with other creative and analytical minds
- We provide an opportunity for you to experiment and fail fast
- We want to make sure you get competitive compensation and benefits
- Riversand's client roster features
high-profile enterprises, which will provide you with industry-specific insights into data management and analysis
- Beyond work, we compete at local 5Ks and 10Ks and have fun at various sporting events

We are an equal opportunity employer (EOE). Individuals seeking employment at Riversand Technologies are considered without regard to race, color, religion, national origin, age, sex, marital status, ancestry, physical or mental disability, veteran status, or sexual orientation.

12-15 years of IT experience, with several years in hands-on data architecture, modeling, and strategy, the majority of it earned in building enterprise-level platforms. Provide technical design leadership, with the responsibility to ensure the efficient use of resources, the selection of appropriate technology, and the use of appropriate design methodologies. Must have extensive industry experience in data modeling best practices and standards. Translate client business problems into technical approaches that yield actionable results across multiple diverse domains; communicate results and educate others through design. Develop database solutions by designing the proposed system and defining database physical structure and functional capabilities, security, and back-up and recovery specifications. Experience with various forms of data design, such as OLTP, OLAP, ODS, EDW, data marts, DSS, and big data. Experience with multiple data warehousing methodologies and modeling techniques, such as relational, dimensional, hierarchical, and graph. Working knowledge of agile development of micro-service, event-driven architecture and lambda architecture. Understanding of database performance and tuning, with experience in fine-tuning complex databases. Demonstrable experience with issue detection and resolution; be ready to provide examples. Working knowledge of backup and recovery procedures. Ability to provide guidance to data scientists, engineers, and other team members. Documentation skills for processes and procedures. Experience with any public cloud services, like AWS, Azure, or Google, is a plus. Experience with NoSQL databases, such as Cassandra, DynamoDB, and MongoDB, is a big plus. Knowledge of data science, ML, and AI will be a plus. Excellent oral and written communication skills.

Job description. **Role summary/purpose:** The candidate will be responsible for defining the data lake system design and data pipeline interfaces (real-time, batch, on-demand, streaming). The candidate will have demonstrated technical delivery experience using the latest big data technologies and products for large advanced analyst communities, including hands-on evaluations and in-depth research to ensure a solid investment roadmap. The candidate must have a solid understanding of Hadoop and big data open source solutions such as Spark, Kafka, Hive, Pig, HBase, and Elasticsearch. Experience with system management tools and system usage and optimization tools is a plus. The candidate will also have experience building relationships and influencing cross-functional teams across a large organization.

**Essential responsibilities:**
+ Develop an effective, coherent, reliable, and phased enterprise data lake architecture approach to help the business grow and change
+ Develop a roadmap for the enterprise data lake platforms for advanced analytics and data science
+ Map business opportunities to appropriate data lake architecture patterns as business strategies and technology mature
+ Develop and maintain processes to acquire, analyze, store, cleanse, and transform large datasets using tools like Spark, Kafka, Sqoop, Hive, NiFi, HBase, and MiNiFi (see the sketch after this list)
+ Provide recommendations, technical direction, and leadership for the selection and incorporation of new technologies into the Hadoop ecosystem
+ Contribute to the development, review, and maintenance of requirements documents, technical design documents, and functional specifications
+ Help design innovative customer-centric solutions based on deep knowledge of large-scale data-driven technology and the financial services industry
+ Help develop and maintain enterprise data standards, best practices, security policies, and governance processes for the Hadoop ecosystem
+ Perform other duties and/or special projects as assigned
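The acquire/cleanse/transform responsibility above is the core data-lake pattern; a minimal PySpark sketch of one such batch step follows. Everything named here (the input path, column names, and output location) is hypothetical, and the posting does not prescribe this exact flow.

```python
# Minimal PySpark sketch of a cleanse-and-transform batch step like the one
# described above. Paths, column names, and outputs are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("data-lake-cleanse").getOrCreate()

# Acquire: read raw landing-zone data (e.g., loaded earlier by Sqoop or NiFi).
raw = spark.read.json("/datalake/landing/transactions/")

# Cleanse: drop rows missing keys, normalize types, deduplicate.
clean = (
    raw.dropna(subset=["txn_id", "account_id"])
       .withColumn("amount", F.col("amount").cast("double"))
       .withColumn("txn_ts", F.to_timestamp("txn_ts"))
       .dropDuplicates(["txn_id"])
)

# Transform: a simple daily aggregate for the curated zone.
daily = (
    clean.groupBy(F.to_date("txn_ts").alias("txn_date"))
         .agg(F.count("*").alias("txn_count"), F.sum("amount").alias("total_amount"))
)

# Store: write partitioned Parquet for downstream Hive/analytics use.
daily.write.mode("overwrite").partitionBy("txn_date").parquet("/datalake/curated/daily_txn/")
```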
**Qualifications/requirements:**
+ Bachelor's degree with a minimum of 4 years of experience in information technology, or, in lieu of the bachelor's degree, a minimum of 7 years of experience in information technology
+ 4 years of experience as an architect, with hands-on working experience on Hadoop, Spark, Kafka, MapReduce, HDFS, Hive, Pig, Sqoop, and Oozie
+ Experience in AI, BI, and machine learning projects
+ Experience working on AWS, Azure, or other cloud providers for big data

**Desired characteristics:**
+ Credit card payment experience
+ Strong background in financial services
+ Extensive experience working with data warehouses and big data platforms
+ Experience working in real-time data ingestion
+ Experience in sourcing and processing structured, semi-structured, and unstructured data
+ Experience working with NoSQL data stores such as HBase, Cassandra, HAWQ DB, etc.
+ Experience in data cleansing, transformation, and performance tuning
+ Experience in leveraging Apache Atlas for data governance
+ Experience in Storm, Kafka, and Flume would be a plus
+ Experience in Java and Spring would be a plus
+ Experience working on Ab Initio would be a plus
+ Hortonworks, Cloudera, or MapR certification would be a plus
+ Basic knowledge of big data administration (Ambari)
+ Demonstrated experience building strong relationships with senior leaders
+ Strong leadership and influencing skills
+ Outstanding written and verbal skills and the ability to influence and motivate teams

**Eligibility requirements:**
+ You must be 18 years or older
+ You must have a high school diploma or equivalent
+ You must be willing to take a drug test, submit to a background investigation, and submit fingerprints as part of the selection process
+ You must be able to satisfy the requirements of Section 19 of the Federal Deposit Insurance Act
+ If currently a Synchrony Financial employee, you must have been in your current position for at least 6 months (level 4-7) or 24 months (level 8 or greater), have at least a "consistently meets expectations" performance rating, and have the approval of your manager to post (or the approval of your manager and HR to apply if you don't meet the time-in-job or performance requirement)

Legal authorization to work in the U.S. is required. We will not sponsor individuals for employment visas, now or in the future, for this job opening. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or veteran status.

**Reasonable accommodation notice:**
+ Federal law requires employers to provide reasonable accommodation to qualified individuals with disabilities. Please tell us if you require a reasonable accommodation to apply for a job or to perform your job. Examples of reasonable accommodation include making a change to the application process or work procedures, providing documents in an alternate format, using a sign language interpreter, or using specialized equipment.
+ If you need special accommodations, please call our career support line so that we can discuss your specific situation. We can be reached at 1-866-301-5627. Representatives are available from 8am-5pm, Monday to Friday, Central Standard Time.

**Grade level: 13.** Job family group: Information Technology. With roots in consumer finance that trace back to 1932, Synchrony Financial is a leader in consumer credit and promotional financing, providing a range of products for a diverse group of national and regional retailers, including Main Street mainstays, local merchants, manufacturers, buying groups, industry associations, and healthcare service providers. We are the largest provider of private label credit cards in the United States based on purchase volume and receivables, and we provide co-branded Dual Card credit cards, promotional financing and installment lending, loyalty programs, and FDIC-insured savings products through Synchrony Bank. Who do we serve? Hundreds of thousands of customers across the U.S. and Canada, spanning the electronics and appliances, home furnishings, automotive, power products and sports, jewelry and luxury, retail, and healthcare industries. Our purpose is clear: we are committed to pioneering the future of financing, improving the success of every business we serve and the quality of each life we touch. This is fitting, because when you join Synchrony Financial, you're joining an organization that recognizes that our people are our greatest asset, every single one of them. That's why we are deeply committed to investing in the growth of each member of our team, and with 80 years of experience, we know how to develop talent. At Synchrony Financial we work hard to offer competitive rewards, compensation, and benefits. When you join us, you become part of a stimulating work environment with vast opportunities to sharpen your skills and embrace new leadership challenges.

Application Developer. Note: this role requires an in-person interview in San Jose. Duration: 6 months+ (possible extensions). Required skills and experience:
- Bachelor's or master's degree in computer science or equivalent
- 6+ years of experience in building large-scale, high-performance, high-availability systems, and strong computer science fundamentals (algorithms, data structures)
- Experience with big data technologies (Spark, HDFS, HBase, Cloudera, MapR, Hadoop, and other frameworks in the Hadoop ecosystem), plus Kafka, Oozie workflow, Elasticsearch, etc.
- Working experience with microservices, container, and streaming technologies
- Fluency in the Java programming language and familiarity with one or more of the following programming languages: Scala, Python
- Extensive experience solving analytical problems using quantitative approaches, operations research, and optimization algorithms
- Comfort manipulating and analyzing complex, high-volume, high-dimensionality data from varying sources
- Passion for answering hard questions with data
- Excellent written and oral communication skills; able to communicate with all levels

Company description: Common purpose, uncommon opportunity. Everyone at Visa works with one goal in mind: making sure that Visa is the best way to pay and be paid, for everyone, everywhere. This is our global vision and the common purpose that unites the entire Visa team. As a global payments technology company, tech is at the heart of what we do: our VisaNet network processes over 13,000 transactions per second for people and businesses around the world, enabling them to use digital currency instead of cash
and checks. We are also global advocates for financial inclusion, working with partners around the world to help those who lack access to financial services join the global economy. Visa's sponsorships, including the Olympics and FIFA™ World Cup, celebrate teamwork, diversity, and excellence throughout the world. If you have a passion to make a difference in the lives of people around the world, Visa offers an uncommon opportunity to build a strong, thriving career. Visa is fueled by our team of talented employees who continuously raise the bar on delivering the convenience and security of digital currency to people all over the world. Join our team and find out how Visa is everywhere you want to be.

Job description: You will be a leader with a demonstrated ability to lead by influence and a track record of successful delivery of large-scale complex systems at an internet or financial services company, or as a product at an enterprise software company. The ability to synthesize and simplify complex needs (such as business capability, operational efficiency, regulatory, security, and privacy considerations) into architecture and system design is required. This includes the ability to work with technical people at all levels and work tightly with program and product management. Prior experience should include architecture and design, validated through delivery and operations, for high-demand data systems for analytic and operational use; examples include real-time data collection systems, pipelines, warehousing, data cleaning and quality, business intelligence, predictive analytics, and real-time scoring systems. A validated ability to quickly grasp and evaluate new ideas and technologies from internal and external sources, and to match them with appropriate technology and business problems to yield pragmatic innovation, is required. Experience in driving architecture for complex projects that cut across multiple teams and geographies is required. Experience with big data or open-source technologies, including Hadoop, is required.

Responsibilities:
- Develop systems and component architectures and APIs that meet the test of time; articulate and evangelize architectural principles reciprocally with engineering teams to ensure that system components fit and last and align with the company's business direction
- Validated ability in leading a small team of senior architects and multi-functional architects and engineers
- Lead a team of architects to define, develop, maintain, and communicate the technology and platform strategy to all levels, including the Visa executive team
- Work with engineering teams to provide continuous architecture and design mentorship and leadership, and be a source of support that ensures successful product delivery and operational excellence in production
- Collaborate with key partners, such as developers, development managers, product and program management, and senior technical and business executives, to drive the architecture strategy
- Resolve approaches for new areas by quickly investigating the state of the art and available technologies
- Take a consultative approach to develop, present, explain, and evangelize the value and vision of proposed architectures and solutions to a wide audience
- Promote architecture standard methodologies and mentor key technical people within the data product organization
- Lead and support a team to design, build, and operate an uncommon data platform
- Champion a culture of innovation in an environment that requires high levels of scalability, security, and reliability
- Establish relationships with key architects across technology organizations and collaborate on promoting architectural standard methodologies
- Chip in to building Visa's technology brand through internal and external initiatives

Qualifications:
• Bachelor's degree in computer science or a related discipline from a top institution required; PhD in computer science highly desirable
• 10+ years of experience, at least 5 in an architect position, required
• Experience managing and leading teams across geographies
• Technical background in data, with a deep understanding of issues in multiple areas such as data management, query processing, distributed processing, high availability, statistical and machine learning, and operational excellence of production systems, required
• Outstanding verbal, written, presentation, facilitation, and interaction skills, including the ability to effectively communicate architectural issues and concepts to technical and non-technical people at multiple organization levels, required
• Experience in M&A (evaluation, integration, etc.)
• Experience in payments or a related industry is a plus

Additional information: Visa will consider for employment qualified applicants with criminal histories in a manner consistent with the requirements of Article 49 of the San Francisco Police Code. All your information will be kept confidential according to EEO guidelines.

Are you ready to revolutionize entertainment? Gracenote is an entertainment data and technology provider powering the world's top music services, automakers, cable and satellite operators, and consumer electronics companies. At its core, Gracenote helps people find, discover, and connect with the entertainment they love. Daily, Gracenote processes 35 billion rows of data and is quickly becoming a world leader in return-path "big data." Over the past 3 years, the company has grown to more than 2,000 employees in 17 countries, including over 600 of the world's top engineers with a passion for music, video, sports, and entertainment technology. Founded in 1998, Gracenote is one of America's most iconic and respected media companies. We are presently looking for a senior software engineer on our big data team to help ingest and process terabytes of events, build profiles, and provide a personalized media experience.

Minimum required skills: either one of the big data processing frameworks, such as Apache Spark (preferred), MapReduce, MapR, Twitter Storm, PySpark, etc., with good overall development skills in any object-oriented programming language; or very strong Java development skills with a willingness to learn and work with Apache Spark and Java. If you are an experienced software engineer dreaming about building a big data profiling and personalization platform at scale on a public cloud, and learning how to derive insights from raw data using careful processing techniques, please read on! We are building a next-generation, company-wide platform for storing, analyzing, and profiling user events arriving at the rate of a Twitter fire hose. Due to the company's unique position, we are able to try our technologies on large volumes from the start and keep at the forefront of innovation in how data is analyzed and monetized. We are using the latest technologies, running on open source big data platforms and deployed on a public cloud. The team is comprised of engineers with experience building profiles, analytics, and big data platforms for leading web, consumer, and open source companies in the Bay Area.

Top reasons to work for us:
- Significant opportunity to work with all pieces of the big data technology stack
- Real influence on the entertainment experience of hundreds of millions of people all around the world
- Open culture where fun, collaboration, and happiness are key
- Profitable company, plus the stability and security of a larger parent company
- Fun activities (we're in the media and entertainment business)
- A work environment that promotes work-life balance

What you will be doing:
- Develop a processing and analytics platform for terabytes of user events using a variety of tools such as Spark, Spark Streaming, Kafka, Hadoop, and Elasticsearch (see the sketch after this posting)
- Build smart-TV and video consumption profiles
- Put into production complex analytics algorithms that run 24/7, in various parts of the world, on a public cloud
- Build large-scale content graphs from video and music metadata
- Operate and scale our infrastructure in AWS

We will consider applicants with either one of the following skill sets:

Required skills, option 1 (Apache Spark with good Java or Scala):
· 5+ years of experience in Java or Scala development
· 2+ years of development with Apache Spark or an equivalent big data processing framework (MapReduce, Twitter Storm, MapR, PySpark, etc.)
· Understanding of data flows, data architecture, ETL, and processing of structured and unstructured data
· Ability to work as part of a team; self-motivated
· BS in computer science, mathematics, or equivalent work experience; master's preferred

Required skills, option 2 (very strong Java or Scala):
· 8+ years of experience in Java or Scala development (Java certifications are nice to have)
· Willingness to learn and work with Apache Spark
· Understanding of data flows, data architecture, ETL, and processing of structured and unstructured data
· Ability to work as part of a team; self-motivated
· BS in computer science, mathematics, or equivalent work experience; master's preferred

Nice-to-have skills:
· Python, Ruby, shell scripting
· Elasticsearch, Kibana
· S3, EC2, and a variety of Amazon Web Services (AWS) technologies
· Familiarity with data mining concepts, machine learning algorithms, and basic statistics

Our passion for music, TV, movies, and sports is at the heart of everything we do, but what really makes us tick is our people. From Emeryville to Sydney and Queensbury to Amsterdam, we are building the team that's going to disrupt the digital universe. This starts by creating a workplace where all things entertainment are celebrated and innovation can come from anyone. If you are interested in being mission critical and on the leading edge of global entertainment technology, then please contact us today! Gracenote, a Nielsen company, is committed to hiring and retaining a diverse workforce. We are proud to be an equal opportunity/affirmative action employer, making decisions without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability status, age, marital status, protected veteran status, or any other protected class.
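The "what you will be doing" list above centers on processing user events from Kafka with Spark; a minimal PySpark Structured Streaming sketch of that pattern follows. The topic name, event fields, and checkpoint/output locations are all hypothetical, not Gracenote's actual pipeline.

```python
# Minimal PySpark Structured Streaming sketch of the event-processing pattern
# described above: read user events from Kafka, parse, and aggregate per title.
# Requires the spark-sql-kafka package on the classpath; names are hypothetical.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("viewing-events").getOrCreate()

event_schema = StructType([
    StructField("device_id", StringType()),
    StructField("title_id", StringType()),
    StructField("event_ts", TimestampType()),
])

events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")
         .option("subscribe", "viewing-events")
         .load()
         .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
         .select("e.*")
)

# Count views per title over 10-minute windows, tolerating late events.
counts = (
    events.withWatermark("event_ts", "15 minutes")
          .groupBy(F.window("event_ts", "10 minutes"), "title_id")
          .count()
)

query = (
    counts.writeStream.outputMode("update")
          .format("console")  # a real job would sink to Elasticsearch or HDFS
          .option("checkpointLocation", "/tmp/checkpoints/viewing-events")
          .start()
)
query.awaitTermination()
```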
The big data software engineer is responsible for the design and development of high-performance distributed computing platforms using big data technologies such as Hadoop, NoSQL, and other distributed-environment technologies. This position will work closely with the big data platform architects to evolve and enhance the AT&T big data platform, evaluate new technologies, help to define big data standards, and ensure the platform's end-to-end scalability, stability, and manageability. It will also analyze, design, program, debug, and modify new data products used in distributed, large-scale analytics and visualization solutions. The position will interact with data scientists and industry experts to understand how the Hadoop platform can fulfill the evolving requirements of the AT&T Universal Data Hub. Specifically, this big data software engineer provides the backbone (the data and data platform) for the team, as well as for the enterprise. This position helps to drive platform optimizations and drive smart business decisions across AT&T. Additional responsibilities include the following:
+ Large-scale systems software design and development experience, with experience in Unix/Linux
+ Work closely with the platform architecture team to help drive the definition and enhancement of the AT&T big data platform architecture and technology stack
+ Lead feasibility analysis; select the technologies that provide the best solution and identify the products available that will best fit the solution proposed; responsible for vendor and technology evaluations, constantly looking for the right tools to support key initiatives
+ Expert-level competency with Scala, Java, Spark, MapReduce, and high-performance tuning and troubleshooting of highly distributed systems; in addition, the candidate must have the ability to mentor and develop others in these technologies
+ Able to benchmark systems, analyze system bottlenecks, and propose solutions to eliminate them
+ Strong communication and presentation skills are required to effectively convey relevant insights to the teams
+ Exercises judgment on how to effectively communicate highly technical and complex details through the use of visualization and careful selection of "knowns" versus "hypotheticals"
+ Familiarity with JVM-based languages, including Scala and Java; Python and R; Hadoop-family languages, including Hive and Pig; and high-performance data libraries, including Spark, NumPy, TensorFlow, or similar
+ Familiarity with HDFS-based computing frameworks, including Spark and Storm, is desirable
+ Experience with full-stack automation, including OS build automation, Hadoop platform deployments, platform instrumentation, and metric collection, reporting, and alerting
+ Experience with modern visualization tools such as Grafana or Kibana, and Zeppelin notebooks
+ A plus for a candidate with in-depth experience with Hortonworks HDP and HDF, LLAP, Druid, HBase, Phoenix, and AMS technologies
+ Professional level: Bachelor of Science in computer science, math, or scientific computing preferred; requires 3-5 years of experience
+ Senior level: Bachelor of Science in computer science, math, or scientific computing preferred; requires 5-8 years of experience
+ Principal level: Master of Science in computer science, math, or scientific computing preferred; requires 8-10 years of experience

The position will be filled in Plano, Texas, or Palo Alto, CA, and requires daily office presence. No relocation assistance is provided.

The Manager, ETL/DB Development is responsible for all activities related to the development, upgrade, enhancement, and support of the enterprise data warehouse, data marts, extract-transform-load (ETL) batch processes, and other data management and analytics applications. The manager provides leadership and guidance for the design, architecture, and implementation of the data warehouse and data management, and is ultimately accountable for the overall solution implementation. May assign personnel to various projects and direct their activities; monitors project schedules and costs; confers with and advises subordinates on administrative policies and procedures, technical problems, priorities, and methods. Business matters will be treated with the utmost urgency, and a focus on results and deadlines until each task or project is completed is key. The qualified candidate will display excellent problem-solving, analytical, communication, coordination, and team management skills. Daily activities will include reviewing current data load processes, leading daily planning meetings, providing estimates, handling ad-hoc requests, and tracking and communicating daily progress on projects and initiatives. This position will also take on various responsibilities throughout the department as assigned. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions.

- Lead the design, architecture, documentation, development, deployment, and support of data warehouse, data mart, and ETL solutions
- Provide architecture and design guidance to the developers; drive resolution for technical issues and provide alternate solutions
- Assist developers and architects in understanding ACA's processes, procedures, and workflow; hold knowledge sessions to ensure clarity and content of training materials
- Manage data warehouse and ETL project tasks for the team; estimate effort, schedule tasks, and drive project deliverables and schedules; communicate status, issues, and recommendations to management
- Work closely with the business to gather, analyze, and document the data requirements for projects of medium to high complexity and moderate to high risk
- Document and map interaction (source-to-target, source-to-ETL mapping) between business processes, information, and data for projects which are of medium to high complexity and moderate to high risk (see the sketch after this list)
- Define and document standards, guidelines, and processes for data warehouse, ETL, and database development; ensure that the guidelines and processes are followed
- Contribute to the development and implementation of standards and tools for data management solutions, covering metadata management, data mining, data modeling, data cleansing, transformation and matching, data stewardship, data quality, data integration, and data security
- Contribute to the creation, management, and enhancement of enterprise data management standards and governance processes
- Contribute to early-phase concept assessments and oversee data management project estimations to help leaders organize business cases for investment decisions
- Lead the creation and maintenance of reference documentation of the systems, including current-state architecture, standards, and processes
- Bring thought leadership and work collaboratively across the technology organization on innovation, improvement, and efficiency programs
- Oversee quality testing and implementation of data acquisition, migration, and information delivery
- Keep up with technology trends in the data management space and lead technology research, evaluation, and selection activities
- Handle multiple projects simultaneously and engage in problem solving on these projects
- Identify and recommend technical and process trainings for the team members
- Other tasks as assigned
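The source-to-target mapping duty above is usually captured as a small spec that both analysts and ETL developers can read; a toy sketch of one way to represent and apply such a mapping follows. The table, column names, and transforms here are hypothetical, not ACA's actual standards.

```python
# Toy illustration of a source-to-target (source-to-ETL) mapping spec like the
# documentation duty described above. All table and column names are hypothetical.
from datetime import datetime

# Each entry: target column -> (source column, transform applied in the ETL step).
CLAIM_MAPPING = {
    "claim_id":     ("CLM_NO",   lambda v: v.strip()),
    "member_id":    ("MBR_ID",   lambda v: v.strip().upper()),
    "service_date": ("SVC_DT",   lambda v: datetime.strptime(v, "%Y%m%d").date()),
    "paid_amount":  ("PAID_AMT", lambda v: round(float(v), 2)),
}

def apply_mapping(source_row: dict, mapping: dict) -> dict:
    """Produce a target-model row from a raw source row per the mapping spec."""
    return {tgt: fn(source_row[src]) for tgt, (src, fn) in mapping.items()}

raw = {"CLM_NO": " 000123 ", "MBR_ID": "ab99 ", "SVC_DT": "20180215", "PAID_AMT": "142.5"}
print(apply_mapping(raw, CLAIM_MAPPING))
# {'claim_id': '000123', 'member_id': 'AB99',
#  'service_date': datetime.date(2018, 2, 15), 'paid_amount': 142.5}
```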
- Bachelor's degree in information systems, business administration, computer science, or another relevant technical degree (equivalent work experience will be considered in lieu of a degree)
- At least 7 years of experience working in information technology across one or more of the following areas: data warehousing, business intelligence, data management, data modeling, data architecture
- At least 5 years leading enterprise data management or systems integration solutions involving multiple systems
- 3+ years of team management experience, preferably in the data management domain
- Excellent relationship management skills
- Expertise in the tools, architecture, standards, and processes used for data warehouse and business intelligence solutions
- Experience in programming languages (SQL) relative to database management systems, and an understanding of database modeling and design
- Excellent communication skills (verbal, written, and presentation), including the ability to tailor communications based on the audience
- Strong analytical and problem-solving skills
- Understanding of application development, database, and infrastructure capabilities and constraints
- Ability to act quickly and triage in a crisis situation, make decisions, and keep key constituents and customers informed of the situation
- Ability to plan, organize, manage, and track multiple detailed tasks and assignments with frequently changing priorities in a fast-paced work environment; demonstrates a sense of urgency around problems
- Proven track record of building influential relationships with internal business and technology customers; ability to influence across departmental lines without direct authority and find common ground to achieve architecture objectives
- Ability to think strategically, identify and understand business needs, and translate them into strategic direction, plans, and solutions
- Excellent knowledge of SQL, database instance tuning, and optimizing complex SQL statements for the best possible performance; experience in DW appliances like Netezza and Teradata and specialized ETL tools like Informatica will be a strong plus
- Experience in big data technologies (Hadoop, Hive, NoSQL, etc.) or machine learning technologies is a strong plus
- Experience in the financial services industry is a strong plus
- Experience working in an environment with NPI data, masking of NPI data, and NPI data access controls is a strong plus

Supervisory responsibility: This position will supervise a team of data warehouse, ETL, and database developers.

Work environment and physical demands: This job operates in a professional office environment. This role routinely uses standard office equipment such as computers, phones, photocopiers, filing cabinets, and fax machines.

Position type/expected hours of work: This is a full-time position. Days of work are Monday through Friday. The daily schedule may vary from 8 am to 5 pm or 9 am to 6 pm. Hours may vary or exceed 40 in any given week depending on the needs of the business. Depending on system needs and emergencies, the manager may be called in on weekends and overnight hours on occasion.

Travel: This position will require up to 10% travel.

EEO statement: ACA provides equal employment opportunities (EEO) to all applicants for employment without regard to race, color, religion, gender, sexual orientation, gender identity or expression, national origin, age, disability, genetic information, marital status, amnesty, or status as a covered veteran, in accordance with applicable federal, state, and local laws. ACA complies with applicable state and local laws governing non-discrimination in employment in every location in which the company has facilities. Please note: this job description is not designed to cover or contain a comprehensive listing of activities, duties, or responsibilities that are required of the employee for this job. Duties, responsibilities, and activities may change at any time, with or without notice.

Job category: Mid-level data analyst. Position title: Data Analyst 3. Location: Washington, DC. Description: Support the Commodity Futures Trading Commission (CFTC) in Washington, DC. The successful candidate will provide ongoing technical support for the maintenance and enhancement of the data standards managed and supported by the data standards team. This will include the following tasks: data standards support, application review, rulemaking support, data standards maintenance, data operations support, and transition support. Clearance level: position of trust. Suitability requirement: Please be aware that this position requires a U.S. government public trust suitability determination. Applicants who accept a conditional offer of employment may be subject to government security investigation(s) and must receive a favorable suitability determination for access to U.S. government information.

Job responsibilities:
· Work closely with the customer to gather and document project requirements
· Perform data analysis to learn and understand the data and the question being answered; independent research may be required to facilitate understanding
· Perform data processing to manipulate the data to prepare it for reporting, including linking the data to other data sources, formatting the data, and supplementing the data with reference data; this may require data modeling skills and the use of intermediary tables to store the cleansed data set
· Create data reports and visualizations to present the data in ways that are meaningful, actionable, and understandable to the customer
· Test each project thoroughly before presenting to management and executives
· Create documentation required under the PMLC and the configuration change board; create user guides for customers
· Perform risk analysis, which may include a broad range of risk measures such as value at risk (VaR), evolution over time, stress testing, conditional VaR, and incremental (and marginal) VaR calculation for individual asset classes and positions (see the sketch after this posting)
· Develop prototype procedures, as well as design database tables, queries, and views in MS SQL Server
· Assist business users, economists, statisticians, and others to support effective use of analytic techniques and CFTC-approved tools

Education requirements: Bachelor's or master's degree in computer science, mathematics, economics, statistics, or a related area of study. Certification requirements: N/A. Experience/skills required:
· Bachelor's or master's degree in computer science, mathematics, economics, statistics, or a related area of study
· At least 5 years of experience using SAS to create and automate reports and data analysis
· Knowledge of and experience with visualization tools and software, SAS Enterprise Guide and SAS stored processes, SQL software and querying, and performance tuning with big data

Travel required: N/A. Physical requirements: N/A. Desired qualifications: N/A. If you feel you are qualified for this position, please go to http://www.salientcrgt.com/careers to apply. Salient CRGT (Salient) is a leading provider of information technology, engineering, and intelligence analysis services to agencies in the intelligence, defense, homeland security, and cyber domains. Salient is proud to be an equal employment opportunity/AAP employer and maintains a drug-free workplace. Salient prohibits discrimination against employees and qualified applicants for employment on the basis of race, color, religion, sex (including pregnancy), age, disability, marital status, national origin, veteran status, or any other classification protected by applicable discrimination laws. Salient also participates in E-Verify. Click here to learn about the E-Verify program. For more information on Salient CRGT, Inc., please visit us at www.salientcrgt.com.
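The risk-analysis bullet in the posting above names VaR and conditional VaR; a worked numeric sketch of the historical-simulation versions of both follows. The P&L series is synthetic and the 95% level is an assumed parameter, chosen only for illustration.

```python
# Worked sketch of historical-simulation VaR and conditional VaR (expected
# shortfall), as named in the risk-analysis bullet above. The daily P&L series
# is synthetic and the 95% confidence level is an assumed parameter.
import numpy as np

rng = np.random.default_rng(7)
daily_pnl = rng.normal(loc=0.0, scale=1_000_000.0, size=750)  # ~3 years, in $

confidence = 0.95
# Historical VaR: the loss at the (1 - confidence) quantile of the P&L series.
var_95 = -np.quantile(daily_pnl, 1.0 - confidence)
# Conditional VaR (expected shortfall): average loss beyond the VaR threshold.
tail = daily_pnl[daily_pnl <= -var_95]
cvar_95 = -tail.mean()

print(f"95% 1-day VaR : ${var_95:,.0f}")
print(f"95% 1-day CVaR: ${cvar_95:,.0f}")
# CVaR >= VaR by construction, since it averages the worst tail outcomes.
```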
Unable to work with 3rd-party agencies. Local candidates only. Unable to sponsor H1Bs at this time. The data architect's focus will be to lead the data architecture for various large initiatives, with logical and physical data models that support the use cases and fit our overall MDM solution strategy and architecture. The data architect works cross-functionally with R&D, other architects, product management, operations, and customers. This individual will help identify solution requirements and define the future data architecture for products.

Responsibilities:
- Define the data architecture, comprised of multiple systems, that allows the company and its trading community to more easily manage the data content that we have in our systems, including: creating the necessary architecture to allow consolidation; creating the mechanism for managing the life cycle of the data sources once established; and applying security measures to allow sharing while protecting private, proprietary data
- Understand enterprise architecture scope and issues, and create solutions consistent with enterprise architecture best practices
- Collaborate with development, IT, and product management teams on options for validating architecture components and sub-systems
- Develop a set of data-layer project plans that identifies: deliverable phases; resource needs; cross-project dependencies, integration issues, and plans; and risks and risk mitigation, both within projects and across projects
- Create logical and physical data models
- Define and implement the data governance process

Required skills:
- Bachelor's degree in computer science, computer information systems, or another technical degree (engineering, math, physics, etc.); master's degree preferred
- 10+ years of experience managing and architecting data systems in modern data architectures
- Demonstrated ability to define data-layer solutions and systems using industry best practices (i.e., methodologies, technologies, and standards) that meet or exceed expectations
- Expert ability in data architecture, data models, and data system implementation
- Ability to discern user requirements and develop specifications
- Familiarity with information security vulnerabilities, including HIPAA, GDPR, and PCI standards, and risk management
- Experience with big data technologies
- Excellent written and oral communication skills and well-developed interpersonal skills
- Ability to articulate ideas to both technical and non-technical audiences
- Superior analytical, evaluative, and problem-solving abilities, with attention to detail
- Expert knowledge of master data management technologies
- Expert-level knowledge of service-oriented architecture approaches and techniques
- Expert-level knowledge of enterprise-level and business-to-business integration techniques and approaches
- Skill in design techniques for reliability, failover, and high-transaction-volume throughput
- Experience with the AWS stack a plus

Aramark (NYSE: ARMK) is in the customer service business across food, facilities, and uniforms, wherever people work, learn, recover, and play. United by a passion to serve, our more than 270,000 employees deliver experiences that enrich and nourish the lives of millions of people in 22 countries around the world every day. Aramark is recognized among the most admired companies by Fortune and the world's most ethical companies by the Ethisphere Institute. Learn more at www.aramark.com or connect with us on Facebook and Twitter.

The Director, Data Applications & Platforms is responsible for the full systems life cycle management of the "data" that supports Aramark's central data and analytics organization. The focus of this role is on collaborating with senior leadership from all lines of business and functions to develop, implement, and optimize business processes that enable data solutions and drive business intelligence and analytics. This role also ensures that long-term solution delivery methodologies are followed, and that these methodologies deliver quality, reliability, stability, and completeness of all data-related programs while also meeting financial objectives and the needs of the business. This position requires exceptional database development, data profiling, data cleansing, and problem-solving abilities, and will report directly to the AVP of Data Foundation within Aramark's data and analytics center of excellence.

Position responsibilities:
- Experience with software design process considerations throughout all stages, including requirements, architecture, design, development, quality assurance, deployment, and maintenance
- Experience leading architecture transformation from traditional to next-gen data-analytic technology solutions
- Familiarity with big data concepts, Hadoop ecosystem components, and complementary technologies
- Experience with scripting in languages such as R, Python, or Perl, or software development in Java
- Proficiency with cloud technologies and concepts, real-time analytics, streaming data platforms, and machine learning tools
- Experience with Informatica PowerCenter, Dataiku, and Alteryx for profiling large data sets, cleansing them, and storing the cleaned data for advanced analytics and reporting
- Manage strategic projects and initiatives as assigned

Qualifications:
- BS/BA degree in computer science, information systems, or a related field preferred
- 7-10 years of experience in building and implementing data applications
- Experience in building and maintaining data transformations (ETL) using SQL or scripting/programming languages such as R or Python
- Ability to analyze, troubleshoot, and performance-tune SQL
- Hands-on experience with database performance, security, and integrity
- Experience in database design, development, and data modeling
- Ability to identify problems and effectively communicate solutions to peers and management
- Strong project planning and management skills
- Ability to multi-task and manage multiple initiatives

Senior Data Engineer. Total Expert is looking for a senior data engineer to expand our software engineering organization. We are a SaaS technology start-up located in Eden Prairie, MN, focused on providing the very first fully integrated central hub for marketing, sales, and collaboration across the real estate and mortgage industries. We are looking for team members who are motivated to move fast, be innovative, and are passionate about delivering a high-quality software platform as part of a collaborative and awesome team! Position overview: We are looking for an experienced senior data engineer to join our growing engineering team!
In this position, you'll work directly with a large team of software, QA, and DevOps team members to expand and support the Total Expert platform and integrations. We're looking for someone who excels at writing SQL and other automation scripts, who has an eye for automation, and who is passionate about data.

What you'll be doing:
- Work with development managers and engineers to design, develop, and manipulate complex data sets and data scripts supporting external integrations, and handle ad-hoc data requests supporting our internal team and customer teams
- Create, update, and maintain a library of automation data scripts and utility applications supporting our platform and users
- Automate recurring data requests and tasks originating from internal and external customers (see the sketch after this posting)
- Participate in software scrum projects and communicate status and obstacles to the delivery team

Minimum qualifications:
- 4+ years of experience working in a data-oriented role, writing complex SQL queries on a relational DBMS (SQL Server, MySQL, Oracle, PostgreSQL, etc.)
- Detail-oriented, with the ability to maintain awareness of priorities; excels at validating the accuracy of results from complex queries and datasets
- Experience using Python, Ruby, PowerShell, or other scripting languages to automate complex scheduled and repetitive tasks
- Strong technical and analytic skills; ability to understand complex systems, processes, and data sets
- Experience with version control systems such as SVN, Git, or TFS
- Demonstrated history of staying current on best practices and tools within relevant fields of technology

Education requirements: a bachelor's degree in computer science, software engineering, information technology, or a related field.
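The automation bullets above describe turning recurring data requests into scheduled scripts; a minimal sketch of that pattern follows, using Python's standard library with SQLite standing in for the relational DBMS. The query, table, and output file are hypothetical, and real scheduling would come from cron or a task scheduler.

```python
# Minimal sketch of automating a recurring data request, as described above.
# SQLite stands in for the production DBMS; the query and filenames are hypothetical.
import csv
import sqlite3
from datetime import date

EXTRACT_SQL = """
    SELECT customer_id, COUNT(*) AS open_requests
    FROM requests
    WHERE status = 'open'
    GROUP BY customer_id
    ORDER BY open_requests DESC
"""

def run_daily_extract(db_path: str) -> str:
    """Run the recurring query and write a dated CSV for the requester."""
    out_path = f"open_requests_{date.today():%Y%m%d}.csv"
    with sqlite3.connect(db_path) as conn:
        cursor = conn.execute(EXTRACT_SQL)
        headers = [col[0] for col in cursor.description]
        with open(out_path, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(headers)
            writer.writerows(cursor)
    return out_path

if __name__ == "__main__":
    # Demo against a throwaway database so the sketch runs end to end.
    with sqlite3.connect("demo.db") as conn:
        conn.execute("CREATE TABLE IF NOT EXISTS requests (customer_id, status)")
        conn.executemany("INSERT INTO requests VALUES (?, ?)",
                         [("c1", "open"), ("c1", "open"), ("c2", "closed")])
    print("wrote", run_daily_extract("demo.db"))
```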
Qlik Sense Developer. Location: Atlanta. Rate: open (all-inclusive). Start date: Feb 12th-Feb 19th. End date: 3-6 months (possibility of extension). The Qlik Sense developer will be part of the team that is designing and building next-generation analytics and business intelligence solutions to provide data insights to Fortune 500 global organizations. The candidate will work in an agile, collaborative team that is passionate about providing simple solutions for hard problems using Qlik Sense and other analytics tools. The solution architect's responsibilities include:
- Understand complex Qlik data models and architecture implementation using Qlik's best practices and standards
- Understand First Advantage cross-business requirements and recommend the best architecture solution using Qlik Sense
- Ability to design, develop, and deploy Qlik Sense analytics
- Understand the Qlik security rules and section access to restrict the data and analytics based on custom roles
- Understand Qlik Sense APIs and work with a team of developers to troubleshoot the API requests and responses
- Understand the Qlik extensions available in the community, and recommend and integrate them with the First Advantage Qlik implementation
- Actively participate and provide inputs and recommendations on the fly during agile development, with a small self-directed team of developers, business analysts, and data engineers
- Lead the technical architecture implementations for Qlik Sense deployments
- Meet with customers, internal account managers, and business users to perform live product demos and proofs-of-concept
- Partner with business analysts and internal business users to come up with innovative solutions and recommendations

Qualifications:
- Minimum of 8 years of strong experience and expertise with SQL and Oracle PL/SQL
- Minimum of 5+ years of experience working with Qlik products (QlikView, Qlik Sense)
- Strong data analytics and data science skills
- Strong working experience in an OEM product company that delivered Qlik Sense solutions to a variety of customers would be a plus
- Experience with analytical programming languages, using R and Python libraries
- Strong experience in relational databases and big data analytics preferred
- Knowledge of Qlik Sense performance tuning and application optimization
- Strong experience in Qlik Sense administration and security rules implementation
- Knowledge of Qlik Sense APIs, extensions, and customizations
- Knowledge of other BI tools, like Oracle Business Intelligence, SQL Server Reporting, or Cognos, would be a plus
- The candidate must have strong communication skills and creative skills

As a patient-focused organization, University of Utah Health Care exists to enhance the health and well-being of people through patient care, research, and education. Success in this mission requires a culture of collaboration, excellence, leadership, and respect. University of Utah Health Care seeks staff who are committed to the values of compassion, collaboration, innovation, responsibility, diversity, integrity, quality, and trust that are integral to our mission. EO/AA.

The incumbent leads the architecture, design, and development of the next-generation data warehouse platform and analytic solutions, and consults with business and technology leaders to understand organization goals. Serves as a lead architect in defining and implementing the new enterprise data warehouse. Establishes and maintains data modeling standards to promote quality model deliverables. Performs technical evaluations and proofs-of-concept and presents recommendations. Recommends architectural changes that maximize system performance, stability, and maintainability. Designs and builds logical data models. Creates automated reports, dashboards, and ETL workflows and mappings. Tunes system performance through SQL tuning, DB optimization, and other strategies. Collaborates with junior team members on the technical approach and design for complex problems. Oversees the architectural direction of the business intelligence platform in order to ensure a cohesive and high-performing system.

Knowledge, skills, abilities: Has significant experience architecting data warehouse solutions within the healthcare and/or life sciences industries. In-depth knowledge of techniques, models, processes, methodologies, and plans within architecture disciplines. Excellent understanding of healthcare industry standards and technology trends. Strong database, SQL, ETL, and data analysis skills. Experience with data warehouse platform evaluations and selection; experience with Teradata, Netezza, or Greenplum highly desirable. Excellent communication skills (written, oral, and presentation delivery). Ability to present complex technical concepts in terms that are relevant and applicable to the business. Ability to sell the value of data services in business terms, to promote better information and capabilities for better decision making.

Qualifications (required): Bachelor's degree in computer science or equivalent; five years of applicable experience.

Working conditions and physical demands: The employee must be able to meet the following requirements, with or without an accommodation. This is a sedentary position that may exert up to 10 pounds and may lift, carry, push, pull, or otherwise move objects. This position involves sitting most of the time and is not exposed to adverse environmental conditions. We are University of Utah Health. healthcare.utah.edu

This role will be responsible for all lifecycle aspects of the solution development, from scoping, gathering, and documenting requirements, through development, to support and maintenance.
- Partner with IT and business users to understand and document the requirements and translate them into a technical design
- Develop custom solutions through the use of Microsoft SQL Server, SSIS, SSRS, and/or Microsoft SharePoint
- Participate in all aspects of application development; develop automated applications surrounding data extracts and data cleansing routines; leverage the development team to provide 24x7x365 support for business-critical applications
- Troubleshoot user database issues, which includes working with the individual end-user groups, engineering, and external clients, as well as the technical support organizations of our 3rd-party vendors; interpret data and work with cross-functional teams to identify and correct technical performance issues; participate in defect triage and work with the project team and vendors to resolve prioritized defect fixes
- Stay abreast of technology advances and participate in the evaluation and implementation of new technologies into the environment

Minimum education requirements: Bachelor's degree preferred, especially in computer science, mathematics, information systems, or a related degree.

Special knowledge and/or skills:
- Working knowledge of health care claims systems, such as QNXT, Facets, Amisys, or similar systems or development tools, a plus
- Excellent verbal and written communication skills
- Strong analytical and problem-solving skills required
- Ability to be flexible and adapt to changing requirements
- Ability to effectively prioritize and handle multiple tasks and projects

Work background/experience: Strong hands-on technical experience (2+ years) in a Microsoft environment (SQL Server, SSIS, SSRS, SSAS, TFS) and writing complex queries using SQL; advanced knowledge of T-SQL stored procedures.

Physical requirements: Physical health sufficient to meet the ergonomic standards and demands of the position.

About us: Virginia Premier is a managed care organization which began as a full-service Medicaid MCO in 1995, partnered with VCU Medical Systems. We strive to meet the needs of the underserved and vulnerable populations in Virginia by delivering quality-driven, culturally sensitive, and financially viable Medicare and Medicaid healthcare programs. Headquartered in Richmond, VA, we also have offices in Roanoke, Tidewater, and Bristol, with additional satellite locations, allowing us to serve over 200,000 members across eighty counties throughout Virginia. We offer competitive salaries and a comprehensive benefits package, including excellent medical, dental, and vision plans, tuition assistance, an infant-at-work program, remote work options, and generous vacation and sick leave policies. Our culture supports an environment where employees can continuously learn and gain professional growth through various development programs, education, exciting projects, and career mobility. All qualified applicants will receive consideration for employment without regard to age, race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or protected veteran status. EOE. Our mission is to inspire healthy living within the communities we serve!
Engility is seeking qualified data architects supporting a customer in the intelligence community. Positions are located throughout the northern Virginia area. Duties will include, but are not limited to: provide data management expertise supporting development of a robust program for shaping enterprise data and metadata processing chains, from collection to and from the analyst; perform modeling of data derived from material provided by the sponsor; apply mathematical tools to sift through large amounts of data that are likely to contain contradictions; apply an understanding of global trends and events (e.g., economics, social issues, religious issues) to assist with the formulation of hypotheses for interpretation of data; generate, modify, and test hypotheses that may explain observed results; report results of analysis orally and in a variety of written formats suitable for audiences of various backgrounds and needs; refine existing methodologies and/or develop new models and analytic techniques consistent with the evolution of technologies and practices; assist with the testing and development of new systems; maintain a level of predictive analytic proficiency necessary to keep up with the state of the art; explore ways to apply existing analytical techniques to new data sets; employ analytic tools to organize and analyze the sponsor's data and to identify patterns; work to develop new tools in support of the sponsor's mission priorities; coordinate model and tool development with other technology projects within the sponsor group; provide support and guidance to the sponsor for sharing techniques and practices with other government agencies; attend technical discussions to support the sponsor's data gathering from external organizations; assist with researching and acquiring data from sponsor sources that support the development of models; produce presentations, reports, and metrics as required by the sponsor.

Typical duties and responsibilities: An active TS/SCI with polygraph clearance is required. Experience in data management, data storage, and data analysis. Technical knowledge of big data storage and analytics, information science, software development and database systems, security, and process and systems engineering. Proven experience converting large, ambiguous data sets into compact visualizations, and transitioning code and requirements to developers. Experience defining and architecting end-to-end data flow and data management systems, and multi-system, multi-software interface definition and documentation. Strong experience working directly with non-technical customers, translating requirements into actionable prototypes, and developing formal requirements for such prototypes. Strong knowledge and experience of software development and modern database and indexing theories. Strong knowledge and experience of different data storage, search, and retrieval models. Experience with any Amazon Web Services (Amazon SimpleDB, Amazon S3, Amazon EC2, and/or Amazon SQS).

Required qualifications: Bachelor's degree in computer science or a similar technical field. Experience with Cloudera and/or Hadoop. Experience with web protocols, SOAP, RSS, and other publishing tools. Experience with any Amazon Web Services (Amazon SimpleDB, Amazon S3, Amazon EC2, and/or Amazon SQS). Department: Information Technology.

Company description: Denodo Technologies is a leader in enterprise data virtualization software. At Denodo we embrace diversity. We enjoy the challenge of being outside our comfort zone. We thrive in dynamic environments, and, at the risk of sounding cliché, we work hard and we play hard. Whatever it is, our employees of all ages and nationalities share a vibrant and optimistic view that life, and also data integration, can be made much better. We're busy working with some of the most leading and exciting technologies in the area of data integration, including Hadoop, NoSQL databases, cloud services, data warehouses, and data visualization and analytics tools. Denodo's growth as a business depends on being innovative and creative, on delivering the best solutions with the highest levels of customer satisfaction, and on having a unique piece of technology to solve real customer problems. And a company can only be as forward-thinking as its people, which explains why we have become the leading developer of data virtualization, data services, and cloud data integration technologies and solutions for the enterprise. We understand our customers' pain points, and we are dedicated to helping our customers get timely access to data in a world of fragmented repositories, diverse technologies, and rapid data growth. We have witnessed their success as they embrace and experience data management through data virtualization. We believe in eliminating data congestion; we believe in the "liberation of data"; and now we are the catalyst for fostering the enterprise use of data and for tapping into new data sources like NoSQL, Hadoop, web, cloud, and the Internet of Things (IoT). Today Denodo has become a global data force. We have become the key enabler of best data integration practices around the world. We carry our message and our solutions everywhere, from South Africa to Norway and Brazil to Japan. We love to team with our customers; we share their experiences and we solve their problems, and we do that with a service culture that has been the cornerstone of our business. Denodo was founded in 1999, with its headquarters in Silicon Valley, and operates globally from its offices in Palo Alto, New York, Chicago, and Boise in the USA; London, Munich, Madrid, and A Coruña in the EU; and Chennai and Singapore in APAC, as well as through a network of global and local partners worldwide.

Job description. Your opportunity: Love to deal with multiple technologies in complex environments? Want to collaborate on solutions for the largest organizations in the world? Ready to advise the modern software engineering practice on pioneering projects? We invite you to materialize your aspirations in a company where opportunities abound for building an amazing future, where discovering the value and potential of a new technology, data virtualization, will be the beauty of your daily routine. Our customers span all market segments and geographies, and we partner with leading and reputable companies across the globe. This combination makes Denodo a fun and exciting environment to work in: a place where your work will have an impact, a place where you're invited to push the boundaries of possibility. Join an international team in pursuing mastery, value, and success in a global environment.

Your duties and responsibilities: As an associate data services engineer, you will successfully employ a combination of high technical expertise, troubleshooting skills, and communication between clients, partners, and internal Denodo teams to achieve your mission.

Customer engagement:
- Diagnose and resolve client inquiries related to operating Denodo software products in their environment
- Manage client support cases on a daily basis
- Respond to client-reported issues in a timely manner and/or per service level agreements
- Communicate progress of resolution and status in a timely fashion and/or per service level agreements
- Engage clients in the product configuration and use of the Denodo platform

Product and technical knowledge:
- Obtain and maintain strong knowledge of the Denodo platform
- Constantly learn new things and maintain an overview of modern technologies
- Contribute to knowledge management activities and promote best practices
- Provide timely, prioritized, and complete customer-based feedback to product management, sales, support, and/or development regarding clients' business cases, requirements, and issues

Organizational skills:
- Know when and where to escalate within the Denodo technical organization to help clients and other technical specialists increase their efficiency when using Denodo products
- Build interpersonal relationships with other Denodo teams

Qualifications. We require: a solid understanding of SQL and a good grasp of relational and analytical database management theory and practice (please note, our interview process will test your proficiency); a bachelor's or master's degree in computer science or computer systems; good knowledge of software development and architectural patterns; experience in Windows and Linux (and Unix) operating systems in server environments; professional curiosity and the ability to enable yourself in new technologies and tasks; creativity in finding the root cause of issues and providing solutions; active listening and teamwork; good written and verbal communication skills for interaction with clients, making presentations, attending meetings, and writing technical documentation; and availability to work outside of normal business hours, on holidays, and on some weekends when needed for on-call periods.

We value: familiarity with enterprise architecture, application development, and infrastructure; understanding of data integration flavors; technical skills including Java development, JDBC, XML, and web-service-related APIs (e.g., JSON); experience with version control systems (e.g., SVN, Git) and authentication systems (e.g., LDAP, Kerberos, SAML); experience in big data, NoSQL, and in-memory environments is welcome; experience in cloud infrastructures; foreign language skills are a plus.

Additional information. Employment practices: We are committed to equal employment opportunity. We respect, value, and welcome diversity in our workforce. We
do not accept resumes from headhunters or suppliers that have not signed a formal fee agreement therefore any resume received from an unapproved supplier will be considered unsolicited and we will not be obligated to pay a referral fee position: it product owner: data and reportingposition summarythis position reports to the it director of applications and is responsible for developing and providing leadership in our client's data and reporting team this candidate will work on defining developing and implementing data warehousing and data mart solutions as well as identifying and implementing bi and reporting tools knowledge about data storage structures and about the various ways that data can be applied to aid in business intelligence and business decisions is required essential functions:• lead a team of data and reporting specialists (3 – 7) to develop and enhance data warehouse and data marts for enterprise-wide use• must have strong analysis and decision-making skills with the ability to lead meetings participate in technical cross-functional sessions ensure staff follows structured audited procedures and ensure adherence to change and configuration management principles • develop and communicate enterprise-wide data governance strategy to support corporate strategy• ensure projects are proceeding on schedule and complete on-time within budget within industry standards and prepares regular status reports • provide database and application solution strategy technical design architecture and support for carrying out the implementation of enterprise database and data warehouse development• work with large amounts of data on a granular level from structured and unstructured data sources• provide proper documentation and generating data features required for modeling reporting and ad hoc analysis• participate in various structured and ad-hoc analysis projects• develops and maintain high-performance data warehouse and data marts by regularly analyzing bottlenecks and remedying them• establish goals and priorities timelines and interface with senior leadership and other business units for successful completion supervisory responsibilities:1 has direct supervision of a team of data and reporting specialists this would include hiring firing and talent development minimum qualifications requirements:1 job knowledge and skill: excellent time management skills with the ability to multi-task and prioritize day-to-day responsibilities; detail oriented highly focused and very organized; ability to maintain a high level of energy and enthusiasm; excellent teamwork skills with the ability to establish and maintain positive and effective working relationships; experience developing a program road-map of projects to achieve business goals and deliver early business value as well as execution of the projects in the program;develops and executes projects under the program while managing the program road map and priorities budget and benefits; must be able to manage a budget2 experience:• 7-15 years of experience with information technology programs and services with demonstrated expertise in enterprise data management and related technologies• 3+ years’ experience in implementing data warehouse solutions etl analytics and reporting• at least 1-year management or supervisory experience• proficient knowledge of data warehouses technologies and design methodologies• in-depth understanding of database design principles• strong interpersonal skills with the ability to work effectively as part of a project or 
program team and foster team cooperation• experience with bi reporting tools such as qlik and tableau• experience with oracle and or postgres preferred• proficient working knowledge of sql and or pl sql languages• understanding of the agile scrum methodology3 education licenses certificates registrations: bachelors' degree in applied mathematics statistics computer science engineering or related field is required advanced degree (ma msc mba equivalent or higher) is preferred4 computer skills: working knowledge of power point word processor spreadsheet other pc applications ability to adapt to new technology as it becomes available5 communication skills: ability to excel in a team environment required ability to communicate with colleagues at all levels of the company outside vendors and customers both verbally and in writing 6 analytical skills: strong analytical and problem solving skills required 7 travel: some travel could be required 1002098brreq id:1002098brcompany summary:walmart global ecommerce is comprised of walmart com vudu samsclub com and our technical powerhouse @walmartlabs here innovators incubate next gen e-commerce solutions in real-time we integrate online physical and mobile shopping experiences for billions of customers around the globe how do we do it? we continuously build and invest in new technology including open source tools and big data innovations data scientists front and back-end engineers product managers and web and ux ui teams collaborate alongside e-commerce experts to envision prototype and bring revolutionary ideas to life in a dynamic flexible and fun work culture job title:big data cloud solutions architectposition summary:the walmartlabs big data platforms team is seeking a big data cloud solutions technical architect to serve in a consulting role for the big data components of walmart’s major application and project initiatives we’re looking for an architect that has extensive experience building big data applications using the hadoop ecosystem and related technologies both on traditional clusters and cloud platforms to collaborate with our internal product development teams constructing scalable and performant big data applications using cloud-based infrastructure responsibilities include:* collaborating closely with application team architects and engineers to identify technologies and platforms suitable for their big data processing requirements and then assisting those teams with onboarding development deployment and debugging on those platforms* investigating new big data tools and technologies for their potential application to common use cases; establishing best practices developing design patterns and writing documentation to disseminate new capabilities to a broad technical audience; working with platform engineers and product managers to specify and deliver new major technology features* providing technical assistance to a broad community of big data infrastructure users such as software application engineers and data scientists through research investigation collaboration and hands-on debugging often driven by specific use case requirements* ensuring that application big data solutions adhere to best practices and enterprise standards for scalability availability efficiency data lifecycle management information security fault tolerance and disaster recoverycity:apexstate:ncposition description:+ assists in the development of engineers and architects+ demonstrates and proves hardware or software technology concepts to other architects and 
management+ develops and implements product development strategies using agile development processes+ drives the execution of multiple business plans and projects+ ensures business needs are being met+ identifies and implements strategies for service enabled technologies (for example service oriented architecture)+ improves the hardware or software technology environment+ leads the creation and implementation of hardware or software technology solutions for the division+ oversees the development of conceptual logical or physical hardware or software designs+ promotes and supports company policies procedures mission values and standards of ethics and integrity+ provides supervision and development opportunities for associatesminimum qualifications:* experience with software design process considerations throughout stages including requirements architecture design development quality assurance deployment and maintenance * familiarity with big data concepts hadoop ecosystem components and complimentary technologies such as hdfs hive spark hbase oozie and kafka; as well as cloud technologies such as block storage object storage computational infrastructure services and higher-level database services* experience with scripting in languages such as python bash perl; or software development in java or scala* familiarity with the differences between traditional hadoop clusters and cloud-native hadoop platformsadditional preferred qualifications:* hands-on experience with writing debugging and optimizing big data processing applications using hadoop streaming hive or spark; odbc jdbc connectivity such as hiveserver and thrift; and streaming data management using kafka* familiarity with the strengths weaknesses and idiosyncrasies of big data solutions of cluster-based platforms (e g hortonworks cloudera and mapr) as compared to as cloud resource providers (e g google cloud platform microsoft azure and similar object storage and ephemeral compute paradigms)* experience building and nurturing long-term technical advisory and consulting relationships with software engineering teams* demonstrable proficiency with linux tools at a shell command line a basic understanding of the java build processes used to compile and package hadoop and a justifiable preference between one of the following: vim emacs bash fish or screen tmuxcategory:software development and engineering division:walmart labsdivision summary:@walmartlabs is the technical powerhouse behind walmart global ecommerce we employ big data at scale -- from machine learning data mining and optimization algorithms to modeling and analyzing massive flows of data from online social mobile and offline commerce we don’t just engineer cool websites mobile apps and new services; we use our own open source tools to create the framework deployment is automated and accelerated through our open cloud platform this makes us incredibly nimble and able to adjust in real-time to our global customers employment type:full timerequisition template:ecommerce we are looking for only w2 candidates no c2c position- software database engineer 2location- redmond wabeyondsoft is a global it consulting solutions and services provider founded in 1995 and headquartered in beijing china beyondsoft has over 30 nationwide offices r&d bases and delivery centers as well as facilities in united states japan india canada and singapore beyondsoft is now focusing on using emerging disruptive technologies like cloud mobility big data and analytics to provide powerful solutions and 
products for clients in a wide range of industries including: high-tech ecommerce finance automobile retail logistics energy manufacturing healthcare telecommunications media & entertainment and travel designs and develops ssas tabular models aligning these solutions to our team's best practices for performance usability architecture and overall standards communicates and coordinates with external and internal teams responsible for converting requirements into deliverables and ensuring on time and accurate work is completed for our customers requirement:experience with azureexperience with azure analysisexperience in machine learning or data science capabilities top 3 must-have hard skills:experience with ssas tabular modelsexperience with daxhave to know sql and have worked with large data setswe are an equal opportunity employer and value diversity at our company we do not discriminate based on race religion color national origin gender sexual orientation age marital status veteran status or disabilitycontact directly for quick response:jimmy rizvi | 425-358-6391| jimmy rizvi@us beyondsoft com east west bank is seeking a data management & integration analyst the data management & integration analyst is responsible for designing building and maintaining solutions for managing and integrating data between operational systems data repositories and reporting and analytical applications this position provides strong analytical and technical supports and interacts with project teams it teams and business stakeholders to drive business intelligence and data strategy across a wide range of projects work with business and technology teams to accurately gather and interpret requirements specifications data models etc for developing data integration and reporting solutions act as liaison between it and business units to provide solutions and business requirements for internal application development provide business requirements (logics) for report automation maintain functional and technical artifacts including design documents data mappings data architecture data models and data dictionaries work with the database development and business intelligence teams to design implement and support end-to-end data solutions assess and document source data and quality and coordinate with the business and technology teams to identify and resolve issues bachelor’s degree required preferably in information systems and computer scienceexperience with an object oriented language such as c# java or python experience both consuming and contributing to a ms sql server hosted data warehouse strong proficiency with excelanalytical and problem solving skills including troubleshootingable to work under pressure while managing competing demands and tight deadlineswell organized with meticulous attention to detailcan-do attitude self-motivated and strong work ethicself-driven to identify areas of improvementmust be team-oriented with experience working on interdepartmental team projectsproduct knowledge in loansdepositsbanking operations evo is seeking a senior big data engineer for our beaverton client as a sr big data engineer you will work with a variety of talented teammates and be a driving force for building solutions for digital you will be working on projects related to consumer behavior commerce and consumer touchpoints this contract opportunity is scheduled to be 1-year duties:design and implement distributed data processing pipelines using spark hive python and other tools and languages prevalent in the hadoop 
ecosystemability to design and implement end to end solutionexperience publishing restful api's to enable real-time data consumption using openapi specificationsexperience with open source nosql technologies such as hbase dynamodb cassandrafamiliar with distributed stream processing frameworks for fast & big data like apachespark flink kafka streambuild utilities user defined functions and frameworks to better enable data flow patternswork with architecture engineering leads and other teams to ensure quality solutions are implements and engineering best practices are defined and adhered toexperience in business rule management systems like droolsskills and requirements:ms bs degree in a computer science or related discipline6+ years' experience in large-scale software development3+ year experience in big data technologiesstrong programming skills in java scala python shell scripting and sqlstrong development skills around spark mapreduce and hivestrong skills around developing restful api'snot a fit for you but know someone that might be? refer them! we have a great referral program where you can earn up to $375 per referral find out more at www evosolutions com refer applicants must be fully authorized to work in the u s and physically be in the u s corp-to-corp requests will not be entertained relocation assistance will not be available for this position evo is an equal opportunity employer and considers qualified applicants for employment without regard to race gender age color religion disability veteran status sexual orientation gender identity or any other protected factor overall purpose: this job requires special approval from the it job ladder owner in compensation this job is found in the technical big data organization only responsible for the development of high performance distributed computing tasks using big data technologies such as hadoop nosql text mining and other distributed environment technologies familiarity with jvm-based function languages including scala and clojure; hadoop query languages including pig hive scalding cascalog pycascading; along with alternative hdfs-based computing frameworks including spark and storm are desirable key roles and responsibilities: uses big data programming languages and technology writes code completes programming and documentation and performs testing and debugging of applications analyzes designs programs debugs and modifies software enhancements and or new products used in distributed large scale analytics and visualization solutions interacts with data scientists and industry experts to understand how data needs to be converted loaded and presented works in a highly agile environment job contribution: seasoned technical professional contributes through proven technical expertise has significant dept functional impact knowledge subject matter expert (sme) within own discipline specialty area; basic knowledge of other disciplines specialty areas deep technical knowledge applies in depth knowledge of discipline specialty area standards processes integrates industry experience and deep professional technical knowledge technical leader and recognized sme on select at&t technologies systems procedures analysis problem solving solves unique problems through evaluative judgment precedent independently applies sophisticated analysis in evaluating issues develops new concepts methods techniques for cross functional initiatives recognizes pursues alternative methods independence guided by department goals objectives exercises latitude in 
determining objectives approaches to projects leads multiple projects of small to medium size and technical breadth contribution to at&t technology key contributor on complex projects initiatives impacts current and future business opportunities through application of specialized technical industry knowledge develops methods techniques based on strategic project objectives communication mentors and provides technical guidance and explains advanced concepts to others in work area coordinates across multiple departments promotes active listening and open communication provides leadership guidance to others education: bachelors of science in computer science math or scientific computing preferred experience: typically requires 5-8 years experience supervisory: no **principal functional skills competencies associated with this title:**+ agile development+ application design architecture+ application development (web)+ application development tools+ application programming interface (api)+ big data software engineering+ data mining and data science+ emerging technologies+ information security management+ programming+ requirements analysis+ software development life cycle+ statistics and actuarial modeling+ testing+ user interface designnote: additional skills competencies may be added to this specific requisition during the application process you will be asked to provide your proficiency and experience with all the skills competencies associated with the requisition click here to view this job description in career intelligence at http: careerintelligence web att com cip view main html# jobprofile 40492205job code - 40492205 founded in 1909 mutual of omaha is a solid family-oriented company that is reliable trustworthy knowledgeable and caring we are a full-service multi-line provider of insurance and financial services products for individuals businesses and groups throughout the united states we are committed to providing outstanding service to our policyholders our commitment to customer service is the cornerstone of our vision and values in an environment where everyone seeks after continuous improvement every day mutual of omaha's information service operation seeks individuals who are technically proficient highly engaged passionate creative innovative and flexible these positions will play an integral part of a dynamic group that provides value for our customers by leveraging technology to innovate solutions and transform processes qualified individuals will be accountable for managing a high performing team of critical individuals leading the enablement of one of our core strategic platforms of our data strategy and target architecture this will entail the management of our enterprise data lake in hadoop and the corresponding services enabling data management as well as advanced analytics and data science capabilities this includes working closely with our data analytics practice leaders as well as strategic partners to enable this next generation platform for data and analytics essential functions:+ the information services manager position will lead ongoing development and support efforts for the enterprise data lake platform & services team within the enterprise data management and architecture area + this individual will be responsible for managing a high performing team of critical individuals leading the enablement of one of our core strategic platforms of our data strategy and target architecture this will entail the management of our enterprise data lake in hadoop and the 
corresponding services enabling data management as well as advanced analytics and data science capabilities this includes working closely with our data analytics practice leaders as well as strategic partners to enable this next generation platform for data and analytics + specific responsibilities of the team include:+ implementing analytics environments leveraging big data technologies including integration with existing data and analytics platforms and tools + designing and implementing data pipelines on big data platform to enable rapid prototyping and accelerating the path to production + developing and configuring big data technologies for ingesting transforming storing and serving data at scale + re-architecting and rationalizing traditional data environments using big data technologies + creating data management solutions covering data security metadata management multi-tenancy and mixed workload management on big data platforms + this person will work closely with the sponsors partner teams and other information services teams to deploy strategic & practical solutions through existing and or new systems technology and will be responsible for the development testing and ongoing support of multiple applications and business processes + oversees and manages the analysis design and installation of information systems and computing technology infrastructure to contribute to the efficient and effective achievement of business objectives + provides oversight for vendor solutions including performance monitoring and vendor relationship management actively builds strategic partnerships with vendors to deliver business capabilities where applicable + integration coordination and management will be critical as this individual will be leading the contracting development and support activities associated with saas and on-premise solution offerings + communicates regularly with business partners it system leadership steering committees and other stakeholders to ensure functionality meets or exceeds needs and to develop ongoing strategy roadmaps and project plans + previous experience in creating strategy and implementing technology and or processes associated with self-service analytical tool enablement data analytics data management business intelligence and related fields is considered important + plans staffs directs and controls team activities this includes all management functions and decisions such as budgeting for and controlling expenses interviewing and selecting personnel appraising performance administering salaries developing and coaching subordinates long-range and short-term planning and ensuring compliance with corporate affirmative action guidelines + articulates a strategic vision which is clear and based on customer and i s requirements and technology architecture direction communicates direction in an appropriate manner with clarity to a wide variety of individuals develops and implements the changes needed to achieve the vision + achieves results through appropriate use of personnel technology and processes provides proactive leadership to projects assessing the overall viability or long-term impact to projects when major changes occur + functions performed are affected by advances in information systems technology and by changes in corporate information systems requirements caused by economic fluctuations insurance industry trends + government regulations and other factors -the scope of this position holder's responsibilities range in size from satisfying diverse needs of 
a single customer to developing a large complex multiple-function system within a highly critical environment minimum qualifications:+ stays current with emerging technologies industry best practices in the solution domains + shows initiative is a self-starter able to handle multiple priorities and has excellent time management skills + strong and effective written verbal and presentation skills with the ability to collaborate with team members and business stakeholders at all levels of the organization + precise attention to detail with a positive attitude and healthy sense of humor + must be comfortable with juggling multiple priorities with the ability to manage time effectively + high level of maturity; able to handle change and stress with ease + excellent problem-solving analytical and investigative skills + develops and coaches assigned personnel toward the achievement of career goals and plans while meeting the objectives of the company facilitates horizontal development (growth within a given job level) and vertical development (movement of strong performers to the next job level) where applicable + selects trains and appropriately uses staff resources aligns or acquires resources needed to deliver effective results builds a versatile and flexible organizational unit successfully utilizes contract employees where applicable + builds and maintains effective relationships with customers and suppliers exhibits effective technical business skills and understanding to conceptually discuss the technologies processes required to meet customer business requirements + formalizes understands and negotiates customer supplier priorities balances immediate problems with established tactical strategic i s and customer plans + successfully understands the impact of technologies on company problems and processes treats customers with respect dignity and appreciation adds value to customer interactions + ensures commitments are met exploiting the appropriate defined architecture and infrastructure effectively guides and executes project control techniques guides development of sound business cases + functions as a positive role model strong coach and change agent builds a team that understands corporate operation area and team visions and strategies motivates inspires energizes and helps people overcome barriers + creates a tactical plan that reflects customer needs and i s’s ability to meet them participates in the development and maintenance of unit area division corporate and (where applicable) customer plans tracks executions forecasts balances and adjusts operating work plans anticipates problems and opportunities while operating the planning process to successfully modify plans to accommodate changes and document correct problems accommodates unplanned circumstances while maintaining focus on tactical strategic objectives searches for and recommends suggestions for expense reduction within their team area and division + consults and participates in the development and or selection of technical solutions as required understands concepts associated with the i s architecture and information services industry understands the impact of technologies on company problems and processes + manages the orderly introduction and change of technology to the company proactively seeks new opportunities to creatively apply technology and knowledge within team area and division + actively supports and understands i s core processes guides the operation of the processes may actively participate in the 
construction enhancement and deployment of core processes searches for and recommends suggestions for process improvements within team area and division + understand customer business processes participates with customer management as requested to define review or modify business processes understands how specified technologies can enable business processes + pursues operational excellence and continuous improvemen now hiring!data warehouse architect: avamere health servicesfull-time position available! please apply online at: https: careers-avamere icims com avamere health services25117 sw parkway ste bwilsonville oregon 97007www avamere com mandatory 10 or more years of in depth hands on experience with enterprise data warehouse design and development using recent versions of wherescape red sql server etl api’s stored procedures data marts and everything in between including working with third-party relational and other data sources lead the design and development effort to deliver both raw and processed data to a team of data analysts apply best practices for release management and version control create project documentation for both it business unit and stakeholder use prioritize features and manage business unit expectations develop and maintain architectural solutions covering data structures integration security and retention coordinate with other it team members regarding infrastructure needs transform business and technical requirements into secure and scalable solutions develop standards for data acquisition governance warehousing archival recovery and destruction lead data model reviews with project team members establish data strategies and architectures that comply with regulatory and security compliance experience with wherescape red microsoft power bi and or oracle required experience in the healthcare industry ensuring security and compliance with appropriate hipaa standards required experience building bi solutions for hr finance departments and clinical operations a plus thorough understanding of relational and dimensional database domains along with a broad knowledge of it infrastructure exceptional communication skills (both written and verbal) interpersonal skills and experience collaborating with business and technical teams including executive management and stakeholders bachelor’s degree in computer science engineering information systems or related field desired experience with pci-dss soc 1 and soc 2 a plus strong problem solving analytical and troubleshooting skills detail oriented self-starter with focus on quality results versatile team player as a part of the avamere family of companies we embrace our mission “to enhance the life of everyone we serve ” being a part of avamere provides us with comprehensive clinical resources such as therapy home health and hospice care to best serve our independent and assisted living residents the avamere family of companies takes a holistic approach to post-acute care we have designed a service that combines all of healthcare companies and healthcare professionals tied by the common goal of providing seamless care and support for all of our patients and residents **req id:** 105432**basic purpose** :the enterprise data & analytics group at love’s travel stops is looking for a big data developer to be a part of a team that designs and develops big data solutions that meet business objectives this is an exciting opportunity to work for a family-owned company that continues to experience growth and get in on the ground floor 
to help build the company’s big data practice the ideal candidate has a deep technical knowledge of the hadoop stack and possesses a desire to push the business further through innovation this role requires a close partnership with the data science analyst community as well as various it teams to ensure requirements are met and solutions are supportable and scalable **major responsibilities:**+ design and implement data ingestion techniques for real time and batch processes for a variety of sources into hadoop ecosystems and hdfs clusters+ visualize and report data findings creatively in a variety of visual formats that provide insights to the organization+ knowledge of data master data and metadata related standards processes and technology+ define and document architecture roadmaps and standards+ drive use case analysis and solution design around activities focused on determining how to best meet customer requirements within the tools of the ecosystem+ ensure scalability and high availability fault tolerance and elasticity within big data ecosystem+ architect and develop elt and etl solutions focused on moving data from highly diverse data landscape into a centralized data lake; also architect solutions to acquire semi unstructured data sources such as sensors machine logs click streams etc + manage all activities centered on obtaining data and loading into an enterprise data lake+ serve as an expert in efficient etl data quality and data consolidation+ stay current with vendor product roadmaps and make recommendations for adoption+ maintain a customer-focused attitude**education and requirements:**+ education:+ bachelor’s degree or equivalent in information technology computer sciences or computer engineering+ experience:+ 8 years it experience+ 3+ years of experience building large-scale data solutions involving data architecture data integration business intelligence and data analytics+ 1+ year of experience working on large scale big data projects+ deep technical knowledge of most components contained within the hadoop ecosystem (mapreduce hdfs yarn hive hbase sqoop etc ) preferable with hortonworks distribution+ experience building streaming analytics solutions using nifi storm or other similar technologies+ understanding of statistical and predictive modeling concepts a plus+ strong java j2ee experience+ experience with visualization tools+ experience with rdbms platforms such as sql server and in-memory columnar storage such as hana**skills and physical demands:**+ skills:+ ability to manage numerous competing demands in a fast paced environment+ excellent verbal and written communication skills+ typical physical demands:+ requires prolonged sitting some bending and stooping + occasional lifting up to 25 pounds + manual dexterity sufficient to operate a computer keyboard and calculator + requires normal range of hearing and vision **job function(s):** information technology**_clean places friendly faces _** _it's been the guiding principle at love's for more than 50 years and it's leading us into the future we're passionate about serving drivers with clean modern facilities stocked with plenty of fuel food and supplies love's has two primary kinds of stores our_ **_'country stores'_** _are fueling stations with a convenience store attached the larger '_ **_travel stops'_** _are located on interstate highways and offer additional amenities such as food from popular restaurant chains trucking supplies showers and more _ the following skills knowledge below is a plus: data migration· 
aws:§ s3§ ec2§ iam§ emr§ cli§ redshift· python (optional)· salesforce (administrator and developer)· scala· apache spark (specifically mentioned sql etl)· apache hive· apache parquet· data science knowledge and experience§ data transformation - refining and curating§ data dictionary· saas technologies - for data archiving decommissioning database - provided by dice aws migration data architect do analytics big data the iot and data science interest you? want to boost your career through the use of cutting edge analytic tools techniques and technology? join a dynamic team of data-driven solution makers! join the north carolina state health plan's data analytics team!the north carolina state health plan for teachers and state employees a division of the department of state treasurer provides health care coverage to more than 700 000 teachers and local school personnel state employees retirees current and former lawmakers state university and community college faculty and staff and their dependents the data analytics unit builds and delivers business intelligence solutions that transform data into critical information and knowledge that empower the nc state health plan to make informed data-driven decisions the department of state treasurer's campus is located in raleigh nc off atlantic avenue is surrounded by many shopping centers and restaurants has access to a 24-hour 7-day a week free gym on the campus offers a competitive benefit package and has free employee parking this position will act as the primary contact and subject matter expert for data and information assetsmanage all recorded information physical and electronicmanage the uploading downloading and organization of information assets to improve data flow within state health plandevelop and maintain vendor relationships in order to maintain latest file layouts data dictionaries reference tables etc facilitate the development and implementation of data quality data protection and data usage standards across state health planreinforce data governance charter as established by the data governance board drive compliance with data standards principles metrics processes related tools and data architecturemaintain complete concise and contemporary documentation including physical data model schema data specifications documents mapping documents architecture documentation bi reporting documents data flow charts source code for business queries business report outputs etc organize the data analytics r drive and files saved thereto using tagging normalized taxonomy etc monitor terminal server as well as r and s drive for space availability on a recurring basis archive or delete files as appropriatedefine indicators of performance and quality and ensure compliance with data related policies standards roles and responsibilities and adoption requirementskeep current with trends and issues in the it and analytics industry knowledge skills and abilities competencies: experience in data management data warehousing business intelligence and or master data management metadata managementbusiness analysis experience (composing documents research testing metric development data analysis report development etc )excellent oral and written communication skillsstrong attention to detail minimum education and experience requirements:bachelor's degreeexperience with sas - provided by dice data management data warehousing business intelligence metadata management sas u s anesthesia partners (usap) is the largest single-specialty anesthesia practice in the 
country with over 4 000 clinical providers and associates by joining our team you will participate in a highly collaborative and dynamic environment as an organization we are mission focused on delivering the highest quality in patient care and you will be directly supporting our talented clinical team we extend this same commitment to quality to our associates and supply tools and resources that will ensure we win in the healthcare marketplace and support of usap we are proud of our inclusive people culture that supports our associates to perform at their best usap is an equal opportunity employer candidates with physician services or related health care experience is a plus we offer a competitive benefits package position summary: the edw sr data architect leads the data team architecture design and development activities for usap directs edw data team developers in developing and enhancing usap etls for new and existing regions leads the requirements and design process for a re-architected edw and data marts of desktop help desk technicians and systems administrators ensures ongoing viability of the usap data warehouse supporting etls and reports communicates requirements for edw improvement to usap leadership and builds business cases to present value proposition relies on extensive experience and judgment to plan and accomplish goals essential duties and responsibilities (include but not limited to):responsible for the overall effectiveness of the usap enterprise data warehouse (edw) and edw operations to ensure high levels of user satisfaction and availability work with software development team(s) and infrastructure teams to provide comprehensive solutions to usapworks with business users to define requirements develop solutions and communicate to the broader organizationwork with external vendors consultants on specific edw projects and new technology initiatives when necessary ensure the security and recoverability of edw etls reports and data leads the ongoing upgrade and enhancement of the usap edw including licensing versions hardware capacity and ongoing performancedevelops best practices policies and procedures to ensure quality and consistency of edw etls and reports consolidates and optimizes the current data warehouse ensure all data warehouse code is maintained in a version control system oversee design and implementation of etl procedures for intake of data from both internal and outside sources and acquisitions; as well as ensure data is verified and quality is checkeddesign and implement etl processes and data architecture to ensure proper functioning of analytics including powerbi and other bi tools reporting environments and dashboardscollaborate with business and technology stakeholders in ensuring data warehouse architecture development and utilization is in alignment with usap requirements and directionscarry out and oversee monitoring tuning and database performance analysisperform and lead the design and extension of data marts meta data and data modelsconceive and design analytics and business intelligence platform architecture for usap lead the re-architecture efforts to develop the next generation usap edw and data martother duties as assigned by management job requirements (knowledge skills and abilities):creativity in problem solving and strong verbal written communication skills will be leveraged in this rolehigh proficiency in dimensional modeling techniques and their applicationstrong analytical consultative and communication skills; as well as the 
ability to make good judgment and work with both technical and business personnelworking knowledge of r code used in data processing and modeling tasks a plusability to communicate professionally with all levels of managementability to read write and speak englishus anesthesia partners inc provides equal employment opportunities (eeo) to all employees and applicants for employment without regard to race color religion sex national origin age disability or genetics education training experience:bachelor’s degree in computer science or an analytical related field including information technology science and engineering disciplineten or more years of experience in it or related fieldseven years or more experience performing data warehouse architecture development and managementexperience in re-architecting existing edwsextensive experience with technologies such as sql server ssis and stored proceduresextensive experience developing testing administering rdbms and monitoring of databasesignificant experience in working with business users to define requirements develop solutions and communicate to the broader organizationseveral years working experience with some of the following: powerbi tableau and other reporting and analytical toolssignificant experience with microsoft azure cloud computing platformexperience with big data technologies such as hadoop and impala a plus this going to be a contract till dec 2018 ( with possibility of extensions)this position will plan document coordinate architect and lead all activities related to enterprise data architecture and data warehousing continued adherence to standards principles and data governance guidelines as required to meet the needs of the organization’s business requirements this individual is also responsible for developing and maintaining a data architecture blueprint for the organization the data architect will work with business subject matter experts and is groups to document data attributes translate business requirements into technical requirements to ensure appropriate controls and verification for enterprise level data integrity core skills include database architecture data modeling (relational dimensional and hierarchical) ·develop and maintain an enterprise data model (edm) to serve as both the strategic and tactical planning vehicles to manage the enterprise data warehouse this effort involves working closely with business users ·collaborate with is teams to ensure quality and compliance by enterprise data architecture participate in data analysis design reviews and conduct walkthroughs at various stages during the development life cycle this includes providing data modeling expertise with diagrams of both relational and dimensional modeling techniques ·collaborate with the data warehouse team and the business partners in designing data marts ·emphasis on methodology modeling and governance define the it design methodology develop process to quickly incorporate model changes and adhere to data governance best practices ·guide educate and mentor the data architecture strategy directives principles & standards to individuals who play data-related roles (e g data analysts etl developers report developers & systems analysts) ·establish and maintain data standards policies and architectures ·provide direction guidance and oversight for data quality controls ·identify opportunities for standardizing data descriptions integration and archiving and elimination of unnecessary redundancy ·capture and maintain metadata creating business 
rules for the use of data ·other projects and duties as assigned educationbachelor’s degree in computer science software engineering or information technology preferred; demonstrated experience may be considered in lieu of degree required experiencesix or more years of direct experience as a data architect or data modeler with a strong data-centric background experience with health rules is mandatory expertise designing and implementing data architectures modeling analysis and profiling four or more years of data warehousing and business intelligence experience with relational database structures theories principles and practices data governance data quality management metadata management and conceptual and logical data design as a data visualization developer the primary responsibility is to deliver impactful dynamic and insightful business intelligence visualizations of healthcare data analytics for the cloud-based inovalon onetm platform using clinical financial and operational datasets plans identifies and designs data integration strategies develops intuitive designs based on industry and user research and builds enterprise interactive dashboards and web pages to guide users through healthcare data analytics;builds intuitive interfaces infographics and visualizations to tell stories with data;iterates prototypes implements and tests interfaces and visualizations using advanced design and development skills;utilizes interactive drill-downs movements and highlights to add layers of information through navigation and encourage users to explore data;use expertise and business intelligence tools including data visualization data aggregation and analysis wireframes high-fidelity visual comps and big data tools to deliver strong data-centric visualizations and user-friendly web applications interfaces for health care products;develop and maintain dynamic data visualizations and interfaces that present complex healthcare data in a simple and clear manner to end users;work as a technical resource and subject matter expert for visualization and computing platforms and act as a liaison between software engineering teams and business stakeholders;embrace the agile sdlc methodology participating and leading agile scrum ceremonies and work in a fast-paced agile environment to create value for inovalon inovalon’s customers healthcare providers and patients; andwork across multi-functional teams to formulate and fine-tune design ideas and guidelines bachelor’s degree in computer science engineering analytics informatics or a related field of study;at least 2 years of experience in the job offered or related occupation;in lieu of a bachelor’s degree and 2 years of experience employer is willing to accept a master’s degree in computer science engineering analytics informatics or related plus 1 year of experience in the job offered or related occupation; anddemonstrated visualization and web interface development experience including knowledge of tableau data warehousing and aggregation techniques machine learning data mining information theory graphical models and advanced data visualization demonstrated dashboard report skills using tableau or other similar data visualization tools to the business must also possess experience writing sql queries (or similar query syntax) and experience with big data tools inovalon provides equal employment opportunities (eeo) to all employees and applicants for employment all qualified applicants will receive consideration for employment without regard to race color 
religion sex national origin or protected veteran status and will not be discriminated against on the basis of disability we are hiring a sr data architect for a contract position in torrance ca for immediate details about this job call us at: sumit kumar (714) 576-7031 or michell casey (949) 860-4715 apply through our web site or email us direct at: sumit kumar@calance com mcasey@calance com ===================== ===================== ** local candidates only (southern california only) ** position: sr data architect job ref#: 31264 duration: 12+ months (on-going contract) location: torrance ca 90501 (on-site only) rate: open depends on exp level (w2 inc c2c) ** we will accept incorporated (inc) corp to corp (c2c) or w2 contractors for this position** ** local candidates only (southern california only) ** calance is a 1st tier vendor with 30 consultants working on-site for this client although this is a contract role the average consultant has been on project with this client between 6-9 years all work will be performed on-site and you must be available for a face to face interviews responsibilities include: influence projects initiatives and drive decisions related to data including data quality data architecture and data management best practices provide expert data modeling and data validation services that produce flexible extensible data structure design solutions that support effective business decisions map entities to use cases and business requirements and assist in the development of data services create and manage and array of data design deliverables including data models data diagrams data flows and corresponding data dictionary documentation develop standards for database design and implementation of various strategic data architecture initiatives around master data management data quality data management policies standards data governance and metadata management serve as an advocate for the data architecture team and data management discipline policies and standards across the enterprise assist in determining the effectiveness of existing technologies and processes in relation to the data architecture create necessary implementation migration plans and recommend new solutions as required daily tasks performed: lead key business critical projects in the capacity of a data architect and advisor define data architecture standards best practices and participate in governance activities lead key data architecture & data management initiatives review business requirements and technical design documents to develop effective data and database solutions create maintain and define conceptual logical and physical data models for various manufacturing projects using relational theory and dimensional modeling provide insight into logical business data requirement to work with dbas to support physical data model considerations define maintain and adhere to enterprise modeling standards develop etl specification and document data migration mappings and transformations for data warehouse loading designing and implementing architectural solutions which involve strategic and tactical business needs providing data modeling design and architecture principles and techniques across master data transaction data and derived analytic data apply best practices through relevant experience across data-related disciplines and technologies particularly for enterprise-wide data architectures data management data governance and data warehousing required skills experience: (resume must reflect this 
Required skills/experience (resume must reflect this experience):
- 7+ years of experience as a data architect providing data modeling, design, and architecture principles and techniques across master data, transaction data, and derived analytic data.
- Experience creating and maintaining conceptual, logical, and physical data models, and data validation.
- Technical understanding of data modeling, design, and architecture principles and techniques across master data, transaction data, and derived analytic data.
- Experience with enterprise-wide data architectures, master data management, data governance, and data warehousing.
- Experience with big data and big data on cloud.
- Experience using data modeling tools such as Erwin or ER/Studio.
- Experience developing ETL specifications and documenting data migration mappings and transformations for data warehouse loading.
- Experience with relational database design and Structured Query Language (SQL).
- Must reside locally (Southern California).
- Must have a stable work history, working in a similar role in a large environment.
- Excellent written and oral communication skills.

Education: Bachelor's or master's degree in computer science or an applicable field of study.

Desired skills:
- Understanding of modern data warehouse concepts.
- Experience with big data platforms.
- Understanding of advanced analytics needs, such as predictive and cognitive analytics concepts.
- Exposure to cloud-based data and analytics solutions.

Calance consultants are offered the following benefits: medical, dental, and vision benefits; 401k retirement program; paid bi-weekly direct deposit; flex spending plan; voluntary life, AD&D, STD, or LTD plans.

The Law Enforcement practice at Buchanan & Edwards is a trusted advisor to our Department of Justice customer, whom we serve in a dynamic and solution-oriented environment. If you are a Java developer who has experience and a special interest in big data engineering, and you enjoy the prospect of being part of a mission-focused, collaborative program team that is interested in helping client organizations determine and execute IT strategy and goals, we would like to hear from you. The Java developer will be part of a team of developers building the tool that provides our client single-source, single sign-on search capability that pulls information directly from multiple data repositories, filters results, and quickly locates the information that is most pertinent. It allows our client to leverage data from 127 repositories. If you spend most of your time engaged in Java development and you have an interest in Solr, this is a role worth exploring.

Must currently possess an active US Government Top Secret clearance with the ability to obtain and maintain SCI access within a reasonable, customer-mandated time frame.

Responsibilities:
- Monitor and audit search application performance through load testing, search testing, and/or query response time testing, and report activities on an on-going basis.
- Configure and set up multiple SolrCloud cores under CentOS and Linux environments.
- Analyze unstructured data for ingest.
- Design and develop ingest processes.
- Design and develop SolrCloud schemas for efficiency.
- Design and develop the interface for application searches.
- Design network architecture for the SolrCloud cluster.
- Design network architecture for cluster disaster recovery.
- Design and develop incremental update procedures.
- Design and develop a collection replacement strategy.
- Define and configure SolrCloud index schemas for multiple data elements.
- Create and schedule periodic refresh of SolrCloud indexes using cron jobs (see the sketch after this list).
- Build SolrCloud queries for multiple search requirements.
- Design and develop search engine monitoring tools.
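The responsibilities above include cron-scheduled refreshes of SolrCloud indexes. A minimal sketch of what such a cron-invoked job might do is below; it assumes a DataImportHandler is configured on the collection (the posting does not say how indexing is done, so this is one common approach), and the host, port, and collection name are hypothetical.

```python
#!/usr/bin/env python3
"""Trigger an incremental (delta) re-index of a Solr collection.

A crontab entry such as:
    */30 * * * * /usr/local/bin/refresh_solr.py
would run this every 30 minutes.
"""
import sys
import requests

# Hypothetical endpoint: <host>/solr/<collection>/dataimport
SOLR_URL = "http://solr-host:8983/solr/documents/dataimport"

def refresh() -> None:
    # delta-import picks up only rows changed since the last run;
    # commit=true makes the new documents visible to searches.
    resp = requests.get(
        SOLR_URL,
        params={"command": "delta-import", "commit": "true", "clean": "false"},
        timeout=30,
    )
    resp.raise_for_status()
    print(resp.text)

if __name__ == "__main__":
    try:
        refresh()
    except requests.RequestException as exc:
        sys.exit(f"Solr refresh failed: {exc}")
```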
Requirements:
- Current TS clearance.
- 5+ years of experience designing, developing, and deploying Java-based applications and services.
- 3+ years of experience using and working with Apache SolrCloud.
- 3+ years of experience implementing SolrCloud builds of indexes, shards, and refined searches across unstructured datasets, to include architectural scaling.
- 3+ years of experience in architectural design and resource planning for scaling SolrCloud capabilities.
- 3+ years of experience with Apache ZooKeeper.
- 2+ years of experience developing search services for API integration with products such as Endeca and Hadoop.
- 2+ years of experience with automated techniques and processes for bulk indexing of large-scale datasets residing in database or un-indexed systems.
- 2+ years of experience working on Scrum or other Agile development methodologies.

Minimum education required: Bachelor's degree in computer science or a related field, or an equivalent amount of work experience, plus a minimum of 7 years of related work experience.

Desired qualifications:
- Hadoop-specific experience.
- Familiarity with entity resolution technology.
- Accumulo experience.
- Experience working with a MapReduce implementation such as Apache Hadoop.
- Ability to communicate the big data vision and strategy to technical, business, and end-user audiences.
- Ability to drive a solution to delivery from the ground up, "hands on."
- Version control experience with SVN or Git.
- Experience with the full Agile software development lifecycle.
- Familiarity with architecture, both logical and physical design.
- Experience with distributed queuing with technologies like Kafka.
- Ability to mentor more junior team members.

Buchanan & Edwards, Inc. (BE) is an information technology and professional services consulting firm located in Arlington, VA. BE is a diversified high-technology services company providing government, commercial, and nonprofit agencies technology solutions and organizational management services to ensure mission success. Serving the federal sector since 1998, we base our solutions on an in-depth understanding of our clients, their mission, and the unique challenges they face. BE is the winner of the 2015 Microsoft U.S. Federal Solution Partner of the Year award, a 2015 and 2016 Washington Post Top Workplace, and has been an Inc. 500|5000 awardee for six consecutive years. Buchanan & Edwards, Inc. is dedicated to fostering, cultivating, and preserving a culture of diversity and inclusion. We are committed to crafting a workplace that endorses creativity and innovation and promotes engagement through open communication, acceptance of new people and ideas, and a supportive team dynamic. Buchanan & Edwards, Inc. is an equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by law. EOE minority/female/veteran/disabled. Buchanan & Edwards, Inc. is an E-Verify employer.

Company description: Advanced Onion is seeking a new member to join our team of qualified and diverse individuals. The qualified applicant will become part of Advanced Onion's team supporting the federal government's mission to provide solutions for the Department of Defense. The products delivered by this team actively help to ensure the safety of service members and their families around the world. We are actively searching for qualified candidates for this potential opportunity and are currently identifying candidates for this future effort. This position is contingent upon award of a task order to Advanced Onion.
Job description: Architect, develop, and maintain exposed data service applications (publishing and/or subscribing sides) designed to use the latest web- and mobile-device technologies.
- Participate in the full software development lifecycle (ecosystem) of mobile device applications.
- Collect and shape requirements for a mobile device-based ecosystem designed to support service-oriented architecture (SOA) and Web 2.0 for analytical and mission needs.
- Develop and maintain applications that expose and consume web services using either native (API) or web-based mobile clients.

Qualifications:
- Experience with iOS and/or Android operating systems.
- Experience with HTML5.
- Experience with Simple Object Access Protocol (SOAP) web services (see the client sketch below).
- Experience with Wi-Fi, CDMA, GSM, 3G, 4G, LTE, or Near Field Communication (NFC) technologies.
- Experience developing native (API) or web-based applications.
- Experience with offline, data-centric architectures that provide for disconnected, intermittent, and/or limited (DIL) communications.

This position description is not intended as, nor should it be construed as, exhaustive of all responsibilities, skills, efforts, or working conditions associated with this job. This and all positions are eligible for organization-wide transfer. Management reserves the right to assign or reassign duties and responsibilities at any time.

Educational requirements: Bachelor's degree from an accredited college or university in computer science, mathematics, or engineering or a mathematics-intensive discipline, fine arts or graphic design, or an applicable training certificate from an accredited institution.

Experience requirements: 2 to 5 years of intensive and progressive experience in a related field, including design, development, and maintenance of web-based applications.

Security clearance requirements: Required: ability to obtain and maintain a minimum of a Position of Trust clearance level, or higher dependent upon client requirements. Desired: an active Position of Trust clearance (or higher: Secret or TS) or valid eligibility.
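Since the qualifications above call for consuming SOAP web services, here is a minimal client sketch using zeep, one common Python SOAP library; the WSDL URL and the GetRecord operation name are hypothetical, stand-ins for whatever a real service's WSDL would publish.

```python
from zeep import Client  # pip install zeep

# Hypothetical WSDL; a real service describes its operations here.
WSDL_URL = "https://example.org/dataservice?wsdl"

def fetch_record(record_id: int):
    # zeep parses the WSDL and exposes each declared operation as a
    # callable under client.service.
    client = Client(WSDL_URL)
    # GetRecord and its recordId parameter are hypothetical names
    # taken from the imagined WSDL, not from the posting.
    return client.service.GetRecord(recordId=record_id)

if __name__ == "__main__":
    print(fetch_record(42))
```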
Additional information: Benefits preview: https://www.zenefits.com/benefits-preview/?token=7856ebb2-c135-493e-bc01-9fb02d0c251e

Disclaimers: A competitive salary for each individual will be commensurate with experience and education as they relate to the position requirements. Unless specifically stated otherwise, each position is onsite at the specified location. Due to regulatory security criteria, all candidates must have U.S. citizenship; H1B visa holders, H1B sponsorships, and U.S. resident green card holders will not be considered. Check individual job opportunities to see if a security clearance is required. Applicants under final consideration for hire will be subject to a thorough background check and security clearance checks. This position description is not intended as, nor should it be construed as, exhaustive of all responsibilities, skills, efforts, or working conditions associated with this job. This and all positions are eligible for organization-wide transfer. Management reserves the right to assign or reassign duties and responsibilities at any time. Advanced Onion is an equal opportunity employer. Your information will be kept confidential according to EEO guidelines.

Global Science & Technology, Inc. (GST), a growing scientific and high technology company, is seeking a Data Access Specialist to support NOAA's National Centers for Environmental Information (NCEI). The position is located in Asheville, NC.

Position summary: The Data Access Specialist will provide science stewardship and support services for the Data Stewardship Division (DSD).

Primary duties: The Data Access Specialist shall maintain and develop applications, configurations, and procedures for cross-center data discovery, dissemination, and access services.

Required education/skills: Bachelor's degree in science or a related technical field. Minimum of 8 years of relevant experience, with up to 16 years of total experience. Requisite abilities and experience in implementing best practices in data management while supporting multiple data product types (i.e., in situ, satellite) on multiple data management and processing IT infrastructures (e.g., NCEI CLASS, ESPC). Thorough knowledge of metadata and DOIs, ISO and data format standards, and best practices required.

Physical qualification(s): Ability to use a computer.

Mental qualification(s): Must be able to effectively communicate technical information (written and verbal) and work status accurately and reliably to project leads and managers.

U.S. citizenship or permanent residency is required. Selected applicants will be required to complete a federal government background investigation. You may also fax your resume to (301) 474-5970. If you need assistance, please call (301) 474-9696. GST offers competitive salaries; vacation, sick, and holiday leave; major medical, dental, life, long-term and short-term disability insurance; 401k retirement plan; tuition assistance; and opportunities for employee career growth and development. All qualified applicants will receive consideration for employment without regard to age, race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or veteran status. GST is an equal opportunity/affirmative action employer.

Job description: **Role summary/purpose:** The candidate will develop the data provisioning approach and designs for data warehouses, data marts, and the data lake. The ETL architect will ensure their designs comply with data architecture standards and procedures and align with the established data architecture risk and control framework. The candidate will be responsible for leading the development of reusable ETL design patterns that accelerate analytic and innovation efforts. The candidate will work closely with the business teams to evaluate and enhance existing controls to improve the quality of data provisioned in the analytic environments.

**Essential responsibilities:**
+ Develop reusable, metadata-driven data integration design patterns that ensure consistent data provisioning processes and controls (see the sketch after this list).
+ Design provisioning processes to ensure the data are an accurate representation of the information required.
+ Work with the data office team to ensure that data are sourced from authorized data sources.
+ Understand business requirements to prepare technical ETL design specifications.
+ Define the ETL and report schemas aimed at optimizing storage capacity and performance.
+ Ensure all ETL development is aligned with Synchrony technology standards and best practice.
+ Ensure all ETL design and development are self-documenting, including data lineage capture.
+ Work with business users, data architects, and data stewards to ensure solutions meet all requirements, in both data availability and performance.
+ Develop and maintain audit and validation processes to detect data integrity problems, and work with developers internally and externally to solve data integrity issues.
+ Perform other duties and/or special projects as assigned.
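To make the "reusable, metadata-driven design pattern" responsibility concrete (referenced in the first bullet above), here is a minimal sketch of the idea: the source-to-target mapping lives in a plain data structure rather than in code, so one loader function serves any feed, and the metadata doubles as lineage documentation. All feed, column, and table names are hypothetical.

```python
from typing import Any

# Metadata: one entry per feed, mapping source columns to target
# columns with a cast. Adding a feed means adding metadata,
# not writing a new loader.
FEED_METADATA: dict[str, dict[str, Any]] = {
    "card_transactions": {
        "target_table": "stg_card_txn",
        "columns": {
            "TXN_AMT": ("amount_usd", float),
            "TXN_DT": ("txn_date", str),
            "MERCH_ID": ("merchant_id", str),
        },
    },
}

def load_feed(feed_name: str, rows: list[dict[str, str]]) -> list[dict[str, Any]]:
    """Apply one feed's metadata to raw rows; returns target-shaped rows."""
    meta = FEED_METADATA[feed_name]
    out = []
    for row in rows:
        target_row = {}
        for src_col, (tgt_col, cast) in meta["columns"].items():
            # The metadata entry itself records lineage: src_col -> tgt_col.
            target_row[tgt_col] = cast(row[src_col])
        out.append(target_row)
    return out

print(load_feed("card_transactions",
                [{"TXN_AMT": "19.99", "TXN_DT": "2018-02-25", "MERCH_ID": "M001"}]))
```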
**Qualifications/Requirements:**
+ Bachelor's degree with a minimum of 5 years of experience in information technology, or, in lieu of the bachelor's degree, a minimum of 7 years of experience in information technology.
+ Minimum of 5 years of experience in modeling and business system designs.
+ Extensive experience with data warehousing, data architecture, data quality processes, data warehousing design and implementation, table structure, fact and dimension tables, logical and physical database design, data modeling, reporting processes, metadata, and ETL processes.
+ Agile experience using JIRA or similar Agile tools.

**Desired characteristics:**
+ Credit card/payment experience.
+ Strong background in financial services.
+ Ability to work with lead and senior developers to troubleshoot problems and solve issues as needed.
+ Experience with shell scripting (Unix/Linux).
+ Experience in identifying and fixing performance bottlenecks in ETL and database processes.
+ Up to date on current technologies in the big data and data science space.
+ Exposure to cloud data warehousing tools preferred.
+ Experience working with data warehouses and big data platforms.
+ Demonstrated experience building strong relationships with senior leaders.
+ Strong leadership and influencing skills.
+ Outstanding written and verbal skills and the ability to influence and motivate teams.

**Eligibility requirements:**
+ You must be 18 years or older.
+ You must have a high school diploma or equivalent.
+ You must be willing to take a drug test, submit to a background investigation, and submit fingerprints as part of the selection process.
+ You must be able to satisfy the requirements of Section 19 of the Federal Deposit Insurance Act.
+ If currently a Synchrony Financial employee, you must have been in your current position for at least 6 months (level 4-7) or 24 months (level 8 or greater), have at least a "consistently meets expectations" performance rating, and have the approval of your manager to post (or the approval of your manager and HR to apply if you don't meet the time-in-job or performance requirement).

Legal authorization to work in the U.S. is required. We will not sponsor individuals for employment visas, now or in the future, for this job opening. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or veteran status.

**Reasonable accommodation notice:**
+ Federal law requires employers to provide reasonable accommodation to qualified individuals with disabilities. Please tell us if you require a reasonable accommodation to apply for a job or to perform your job. Examples of reasonable accommodation include making a change to the application process or work procedures, providing documents in an alternate format, using a sign language interpreter, or using specialized equipment.
+ If you need special accommodations, please call our career support line so that we can discuss your specific situation. We can be reached at 1-866-301-5627. Representatives are available from 8am to 5pm, Monday to Friday, Central Standard Time.

**Grade level: 12**

Job family group: Information Technology.

With roots in consumer finance that trace back to 1932, Synchrony Financial is a leader in consumer credit and promotional financing, providing a range of products for a diverse group of national and regional retailers, including Main Street mainstays, local merchants, manufacturers, buying groups, industry associations, and healthcare service providers. We are the largest provider of private label credit cards in the United States based on purchase volume and receivables, and we provide co-branded Dual Card credit cards, promotional financing and
installment lending, loyalty programs, and FDIC-insured savings products through Synchrony Bank. Who do we serve? Hundreds of thousands of customers across the U.S. and Canada, spanning the electronics and appliances, home furnishings, automotive, power products and sports, jewelry and luxury, retail, and healthcare industries. Our purpose is clear: we are committed to pioneering the future of financing, improving the success of every business we serve and the quality of each life we touch. This is fitting, because when you join Synchrony Financial, you're joining an organization that recognizes that our people are our greatest asset, every single one of them. That's why we are deeply committed to investing in the growth of each member of our team, and with 80 years of experience, we know how to develop talent. At Synchrony Financial, we work hard to offer competitive rewards, compensation, and benefits. When you join us, you become part of a stimulating work environment with vast opportunities to sharpen your skills and embrace new leadership challenges.

**PwC LoS overview**
PwC is a network of firms committed to delivering quality in assurance, tax, and advisory services. We help resolve complex issues for our clients and identify opportunities. Learn more about us at www.pwc.com/us. At PwC, we develop leaders at all levels. The distinctive leadership framework we call the PwC Professional (http://pwc.to/pwcpro) provides our people with a road map to grow their skills and build their careers. Our approach to ongoing development shapes employees into leaders, no matter the role or job title. Are you ready to build a career in a rapidly changing world? Developing as a PwC Professional means that you will be ready to create and capture opportunities to advance your career and fulfill your potential. To learn more, visit us at www.pwc.com/careers. It takes talented people to support the US firm of the largest professional services organization in the world. Not all of us work directly with external clients; some of our best people choose to apply their talents inside PwC as part of Internal Firm Services. You're serving an organization on par with many of our external clients. Our Internal Firm Services team consists of first-rate marketers, human resource professionals, computer technologists, knowledge managers, accountants, financial planners, administrators, and leaders. Internal Firm Services staff are the people who make it work for the people who make it work for our clients.

**Job description**
The Office of the Chief Data Officer is charged with being the voice of data and generally representing data as a strategic business asset. The primary role of this organization is to champion the use of data and information across the firm and drive changes and improvements in data-related operations. This office will help to enable the business as well as provide insights related to attendant risks. Significant effort will be dedicated to determining PwC's data-related needs and developing proposed solutions. The Chief Data Officer team identifies where, when, how, and why the business uses data and transforms it into information that serves clients. PwC's data management organization is an internally-focused team which is strategically aligned to the firm's priorities and passionately focused on maximizing the value of data as a strategic asset by maintaining data standards and improving data quality across the enterprise. The team is responsible for assisting with implementing the overall strategic direction of the data management organization and
the information management strategy of our cross-line-of-service and cross-functional organization. Establishing and maintaining the highest possible standards for managing our master data will enable simplification, reduce costs, and increase the intrinsic value of the information the team manages. The master data management (MDM) group works collaboratively across marketing and sales, lines of service, ethics and compliance, risk and quality, the Office of Strategic Change, finance, firm leadership, data owners, and other enabling functions to ensure global alignment to our transformation initiatives around collaboration, contact-to-cash, global advisory, and information sharing across our network. The group works closely with a vast array of functional teams and line-of-service-specific functional teams while serving as the center of excellence, in order to integrate information from either source or consuming systems. MDM is also involved in designing and driving common elements through downstream systems and processes.

**Position/program requirements**
Minimum year(s) of experience: 4. Minimum degree required: high school diploma. Degree preferred: bachelor's or master's degree in computer science, information management, data science, business intelligence, or data visualization (e.g., BA, BS, MS), or equivalent experience.

Knowledge preferred: demonstrates extensive knowledge and/or a proven record of success in the following areas:
- Understanding of data and the analysis required to align and conform data structures to integrate with the MDM enterprise data architecture (MEDA);
- Working with integration technologies, SQL, SQL Server Reporting Services, and Excel data analysis;
- Understanding, or the aptitude to understand, how information flows from marketing and sales, risk and quality, and finance, as well as how information is used for reporting, both territory-specific and globally across the PwC network;
- Understanding of the various functional networks and an ability to build, expand, and/or leverage existing relationships;
- Understanding the impacts of significant business and technical changes, how they apply to current operations, and ultimately establishing minimal disruption to the business process supported;
- Working with both business and technical resources to accurately align information assets into MEDA;
- Project management and effective prioritization and time management skills, with the ability to handle multiple projects simultaneously;
- Working with data structures and their interrelated dependencies in navigating complex information needs; and
- A high level of client service, modeled through the ability to develop work approaches, project plans, and communications that meet or exceed client requirements and establish maximum-impact outcomes.

Skills preferred: demonstrates extensive abilities and/or a proven record of success in the following areas:
- Implementing strategic plans, with experience in project management;
- Creating professional networks, building strong relationships, and organizing and collaborating with individuals at all levels of seniority and lines of the organization;
- Managing and setting priorities and collaborating across the delivery teams;
- Defining data enhancement projects, establishing timely completion of standard tasks, and maintaining responsiveness to the needs of team members;
- Creating and conveying a value proposition (oral or written), including the ability to identify the issue (prepare and listen), share insight, validate, co-develop a solution, and move to next steps;
- Using judgment to foresee
technical issues based on approaches, with an understanding of the impact of issues and ideas as they relate to the firm's strategic initiatives and objectives;
- Facilitating, and the ability to coach and teach others;
- Solving problems and resolving conflict using written and verbal communication skills;
- Managing a budget, and the ability to collect and analyze data to communicate return on investment and impact; and
- Navigating and working effectively in a heavily matrixed organization.

All qualified applicants will receive consideration for employment at PwC without regard to race; creed; color; religion; national origin; sex; age; disability; sexual orientation; gender identity or expression; genetic predisposition or carrier status; veteran, marital, or citizenship status; or any other status protected by law.

Hello, greetings from Xoriant! I am Anusha, and I am reaching out to check your availability. Please review the job description below and submit your updated resume as soon as possible. Feel free to reach me at 732-898-6749. Also, if someone else in your network is qualified for the position, please feel free to forward this mail to them.

Position: Senior Data Engineer
Location: Sunnyvale, CA
Duration: 6+ months

Position summary: Very strong engineering skills; should have an analytical approach and good programming skills. Provide business insights while leveraging internal tools and systems, databases, and industry data. Minimum of 5+ years' experience; experience in retail business will be a plus. Excellent written and verbal communication skills for varied audiences on engineering subject matter. Ability to document requirements, data lineage, and subject matter in both business and technical terminology. Guide, and learn from, other team members. Demonstrated ability to transform business requirements into code, specific analytical reports, and tools. This role will involve coding, analytical modeling, root cause analysis, investigation, debugging, testing, and collaboration with business partners, product managers, and other engineering teams. Must have a strong analytical background. Self-starter; must be able to reach out to others and thrive in a fast-paced environment. Strong background in transforming big data into business insights.

Technical requirements:
- Knowledge/experience of Teradata physical design and implementation, and Teradata SQL performance optimization.
- Experience with Teradata tools and utilities (FastLoad, MultiLoad, BTEQ, FastExport).
- Advanced SQL (preferably Teradata).
- Experience working with large data sets.
- Experience working with distributed computing (MapReduce, Hadoop, Hive, Pig, Apache Spark, etc.).
- Strong Hadoop scripting skills to process petabytes of data.
- Experience in Unix/Linux shell scripting or similar programming/scripting knowledge.
- Experience in ETL processes.
- Real-time data ingestion (Kafka); see the consumer sketch after this posting.

Nice to have:
- Development experience with Java, Scala, Flume, Python.
- Cassandra.
- Automic scheduler.
- R, R Studio, SAS experience a plus.
- Presto.
- HBase.
- Tableau or similar reporting/dashboarding tool.
- Modeling and data science background.
- Retail industry background.

Education: BS degree in specific technical fields like computer science, math, or statistics preferred.

Anusha Guttikonda, Recruiter. VoIP: 732-898-6749 | anusha.guttikonda@xoriant.com
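The technical requirements above mention real-time ingestion with Kafka. Here is a minimal consumer sketch using the kafka-python package; the topic name, broker address, and consumer group are hypothetical, and the retail flavor of the events is an assumption suggested by the posting, not stated in it.

```python
import json
from kafka import KafkaConsumer  # pip install kafka-python

# Hypothetical topic and broker; in a retail setting this might carry
# point-of-sale events for near-real-time loading.
consumer = KafkaConsumer(
    "pos-events",
    bootstrap_servers=["broker1:9092"],
    group_id="ingest-loader",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # Downstream, this would be written to staging (e.g. Teradata via
    # batched inserts) rather than printed.
    print(message.topic, message.partition, message.offset, event)
```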
Only locals. Big Data Engineer.
Location: Stow, MA (100% onsite)
Start date: ASAP
Rate: open
Phone: 510-962-4340
Email: mazhera@techaspect.com

Principal duties and responsibilities: As a software engineer focusing on big data, you will work with the IT team to develop data platforms that turn big data into big insights with tremendous value. This role requires knowledge of, and hands-on experience with, big data technologies used throughout the entire application stack, including Spark, Cloudera Data Hub, and the Python, Scala, and R languages.
* Design and develop ETL pipelines, connected devices, web applications, and mobile applications that support the customer experiences (see the PySpark sketch after this posting).
* Collaborate with front-end and mobile app development teams on user-facing features and services.
* Work with platform architects on software and system optimizations, helping to identify and remove potential performance bottlenecks.
* Focus on innovating new and better ways to create solutions that add value and amaze the end user, with a penchant for simple, elegant design in every aspect, from data structures to code to UI and systems architecture.
* Stay up to date on relevant technologies, plug into user groups, and understand trends and opportunities that ensure we are using the best techniques and tools.
* Work with other software leads on developing the continuous integration (CI) pipeline and unit test automation.
* Document the work you do, especially the APIs that you create.

Qualifications (demonstrated competence):
* Delivered the full lifecycle of a solution using Hadoop (data ingestion to information availability).
* Delivered at least one big data solution using cloud services and open source.
* Expert knowledge of programming languages such as Java, Scala, or Python.
* Ingested data using big data ETL tools (Apache Spark).
* Implemented data security and privacy in a cloud environment.
* Delivered solutions using Agile methodology.
* Delivered solutions within a global IT enterprise.

Highly desirable but not required skills include:
* Experience with cloud computing (Amazon Web Services preferred).
* Experience with cloud computing services (Amazon Web Services such as EC2, Dynamo, S3, RDS preferred).
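The duties above center on building ETL pipelines with Spark. Below is a minimal PySpark batch sketch of the extract-transform-load shape described, assuming JSON device events landed in a raw zone; the paths, columns, and aggregation are all hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("device-events-etl").getOrCreate()

# Extract: raw JSON events from connected devices (hypothetical path).
raw = spark.read.json("hdfs:///raw/device_events/2018-02-25/")

# Transform: keep well-formed events, derive an event_date column,
# and aggregate per device per day.
daily = (
    raw.filter(F.col("device_id").isNotNull())
       .withColumn("event_date", F.to_date(F.col("event_ts")))
       .groupBy("device_id", "event_date")
       .agg(F.count("*").alias("event_count"),
            F.avg("battery_pct").alias("avg_battery_pct"))
)

# Load: write a partitioned, query-friendly table for downstream BI.
daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "hdfs:///curated/device_events_daily/"
)

spark.stop()
```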
**Are you looking to make a difference in your career? We're working on smarter grids, cleaner energy, and tools to help people manage energy more efficiently.**

**About IT**
The role of IT goes beyond the traditional information technology "service provider." Many of the innovative ideas and projects that shape the company's future and move SCE forward are dependent on technology. IT employees are at the heart of these projects, collaborating, designing, and executing technology solutions that are transforming our industry.

**Position overview**
The Sr. Analytics Architect & Data Engineer will work with the Sr. Information and Analytics Manager to help develop and socialize business analytics strategy, roadmap, and architecture, primarily in the areas of geo-spatial visualization and situational awareness. The Sr. Analytics Architect & Data Engineer will also work with business analytics SMEs and data scientists to perform extensive data engineering and geo-spatial visualization design for advanced analytics and situational awareness use cases. The selected candidate will be responsible for modeling complex analytical problems, discovering insights, and identifying opportunities through the use of statistical, algorithmic, data mining, and GIS visualization techniques. In addition to advanced analytical skills, the individual in this role should be proficient at integrating and preparing large, varied datasets, architecting specialized data marts and analytical solutions, overseeing analytical solution deployment, and communicating results. The selected candidate's responsibilities will also include driving continuous improvement of business analytics technical capabilities, client consultations, defining analytics requirements, analytics solution evaluations, recommendations, impact assessments, rapid solution prototyping, and planning data acquisition and preparation for advanced analytics. The work also involves collaborating with business leaders to translate business strategy and requirements into analytical strategy, architecture, and solutions that align with enterprise architecture frameworks and strategic direction.

**Typical responsibilities**
+ Develop and socialize business analytics strategy, roadmap, and architecture, specifically for geo-spatial visualization and situational awareness requirements.
+ Identify and elaborate situational awareness analytics use cases, goals, and objectives: state goals in business terms, state objectives in technical terms, and define and state success criteria.
+ Develop analytics and data mining project plans.
+ Drive continuous improvement of business analytics technical capabilities.
+ Provide client consultations, technical solution evaluations, recommendations, and impact assessments for analytics solutions.
+ Perform rapid solution prototyping for use cases, specifically involving real-time analytics and geo-spatial visualization for situational awareness.
+ Use machine learning, data mining, and GIS visualization tools and techniques to create new, scalable situational awareness solutions for business problems.
+ Develop and promote best practices for geo-spatial visualizations.
+ Make strategic recommendations on data collection, preparation, integration, quality, exploration, and retention. Identify what data is available and relevant, including internal and external data sources; leverage new data collection processes such as smart meters, SCADA, and social media; collaborate with information architects and data stewards/SMEs to select relevant sources of data.
+ Facilitate selection and preparation of data to be used for specific use cases; develop and recommend data sampling techniques; develop data cleaning specifications and approaches.
+ Keep abreast of new and current geo-spatial, real-time situational awareness, and big data techniques, and develop approaches to adopt them.
**Minimum qualifications**
+ Bachelor's degree in a quantitative discipline such as statistics, mathematics, engineering, computer science, data science, or information sciences, or a related technical discipline.
+ Eight or more years of IT or technical experience.
+ Three or more years of experience with GIS tools such as Esri ArcGIS for situational awareness visualization use case design and development.
+ Three or more years of experience with data preparation and data mining using large (big data) structured and unstructured datasets.

**Desired qualifications**
+ Master's degree in data science or computer science.
+ Three or more years of experience in data preparation and data mining using Hadoop and/or SAP HANA technologies.
+ Certification or expertise in at least one of: data science and/or predictive analytics; geo-spatial visualization tools, e.g., Esri ArcGIS.
+ Professional experience with the use of analytics and visualization toolsets, specifically SAS Enterprise Miner, SAS VA, Power BI, SAP Lumira, SAP Predictive Analytics.
+ Experience with ETL tools (e.g., Talend, SAP Data Services, Oracle GoldenGate, etc.).
+ Five or more years of experience delivering business intelligence solutions in a regulated utility environment.

**Comments**
+ Relocation may apply to this position.
+ Candidates for this position must be legally authorized to work directly as employees for any employer in the United States without visa sponsorship.

Southern California Edison, an Edison International (NYSE:EIX) company, serves a population of nearly 14 million via 5 million customer accounts in a 50,000-square-mile service area within central, coastal, and southern California. Join the utility leader that has been safely delivering reliable, affordable electricity to our customers for over 125 years. _If you require special assistance or accommodation while seeking employment with Edison International, please call Human Resources at (800) 500-4723 and choose option 3 for the Employee Information Center. Representatives are available Monday through Friday, 8 a.m. to 4 p.m. Pacific Time (except Wednesdays, when the center closes at 2:30 p.m., and holidays), or (800) 352-8580 (telecommunications device for the hearing impaired - TTY)._ Southern California Edison is an affirmative action and equal opportunity employer. Minority/female/disabled/veteran/gender identity/sexual orientation.

**Job:** _Information Technology_
**Title:** _Senior Analytics Architect & Data Engineer (ITS4)_
**Location:** _US-CA-Rosemead_
**Requisition ID:** _71019518_

Visa candidates of any classification cannot be considered!

Senior Big Data Architect
Location: Philadelphia, Pennsylvania
Relocation paid: yes; will relocate nationwide
Salary: great base
Employment type: full-time, W2, direct hire
Degree required: university master's degree

Why is this a great opportunity? Excellent opportunity to grow into a big data practice. Top salary for experience, plus joining bonus.

Roles & responsibilities:
- Develop a variety of user-oriented applications to deliver analytic content in intuitive and innovative ways through applications and web interfaces.
- Work in cross-disciplinary teams with industry experts to understand the needs and analytics use cases of leading corporations and organizations.
- Develop, build, test, and deploy applications using iterative and Agile-like development processes.
- Translate advanced business analytics problems into technical approaches that yield actionable recommendations in diverse domains such as risk management, product development, marketing research, supply chain, and public policy.
- Communicate results and educate
others through insightful visualizations, presentations, and demonstrations.

Qualifications: Must have big data experience; Hadoop is required. Bachelor's degree from an accredited college or university in computer science, computer engineering, engineering, or related fields with five years of relevant experience, a master's degree with two years of experience, or a PhD with one year of experience. Fluency in several programming languages, such as Python, C#, Ruby, Java, or JavaScript, with the ability to pick up new languages and technologies quickly. Strong written and verbal communication skills; ability to work in dynamic team environments and multi-task effectively. Proficiency in Unix/Linux environments and the ability to develop in terminal environments. Proficiency in web front-end, back-end, .NET, or Linux-based development, including planning, build, testing, and deployment.

- Leverage extensive knowledge of big data systems, process flows, and procedures to aid analyses and recommendations for solution offerings.
- Lead feasibility analysis, select the technologies that provide the best solution, and identify the products available that will best fit the solution proposed.
- Design the solution, taking advantage of existing assets and maintaining a balance between architecture requirements and specific client needs.
- Create prototypes, conduct proofs of concept, evaluate options with pros and cons, and provide recommendations.
- Assess the business situation by engaging with varied levels and profiles of stakeholders.
- Facilitate project execution by providing support in project-level issue resolution and scope management; ensure quality of all project deliverables.
- Collaborate with other team members (involved in the requirements gathering, testing, roll-out, and operations phases) to ensure seamless transitions.

Qualifications: Bachelor's or master's degree in computer science or a related field from a top university; an MBA is desirable. At least 2-4 years of professional experience in IT, IT services, or consulting work. At least 2 years of hands-on development experience using open source big data components like: Hadoop, Flume, Hive, Spark, Sqoop, Pig, HBase, AWS, Kafka, Impala, Cloudera, Cassandra. Programming experience in: SQL, Java, Python, R, Scala. Proven ability to lead both on-shore and off-shore teams in the development of big data solutions. Deep expertise in the principles and architecture of various technologies within the big data stack is required. Experience implementing 2-3 end-to-end big data cloud (Amazon Web Services or other) solutions across multiple technologies and platforms. Ability to communicate clearly and effectively in interpersonal and written formats. Familiarity with Agile development and DevOps best practices preferred. Knowledge of IT strategy and management desirable.

Data Visual Developer
Irving, TX
Apply with System One
Type: contract
Category: information technology
Job ID: 125915
Date posted: 02/25/2018

System One has an immediate opening for a Data Visual Engineer with a client located in Ft. Worth, TX. This is a long-term contract slated to go to the end of 2018. At this time, we are unable to submit 3rd-party candidates or those currently requiring visa sponsorship. Ideal candidates will have either R or Python programming experience, along with data mining and big data concept and tool experience. Any experience with Alteryx data blending and Tableau data visualization is strongly preferred. If interested, please email Matt McDill, Senior Recruiter, at matt.mcdill@systemone.com.
General job description: Our client currently has a Data Engineer / Data Visual Developer opening. This position is geared to lead the development of new data mining methods, analytics, and tools. Many assignments will involve the use of statistical simulation tools (R, Alteryx), programming languages such as Python, VBA, Java, and C#, and data visualization tools such as Tableau.

Job qualifications, minimum (education & prior job experience):
• BS in engineering, data science, operations research, computer science, mathematics, related fields, or equivalent experience/training.
• Demonstrated aptitude for logical analysis, problem identification, and problem solving, as well as excellent interpersonal and communication skills.
• Ability to use computer-based tools and programming languages to implement solutions.
• Experience in programming languages such as R, Python, SQL, HTML, JavaScript, C#.

Preferred qualifications (education & prior job experience):
• Advanced degree (MS) in engineering, data science, operations research, computer science, or mathematics.
• 2+ years' experience in a technical professional environment.
• Preference will be given to candidates with demonstrated experience in one or more of the following areas: R or Python programming; Alteryx data blending and Tableau data visualization; data mining and big data concepts and tools; knowledge of classification, statistical, and machine learning algorithms for predictive and prescriptive analytics.
• Ability to wrangle data from disparate sources into a comprehensive view of a business problem. Prior experience with recommender systems and/or text mining a plus.
• Ability to work very well with an operationally driven business unit in an agile fashion.

Skills, licenses & certifications:
• Advanced knowledge of programming languages R, Python, SQL, HTML, JavaScript, C#.
• Advanced knowledge of Tableau data visualization and Alteryx data blending.

Language & communication skills:
• Effective verbal and written communication skills.
• Ability to effectively interact with employees at all levels within the organization, including senior management.

Apply with System One.

The Data Architect/Engineer will lead, design, and architect solutions addressing complex investment data integration, process re-engineering, quantitative data science, predictive analytics, and automation. This position will need to leverage deep industry expertise, as well as skills in analysis tools, data structure design, data modeling, and algorithm design, to 'connect the dots' and identify the best sources for data and the optimal flow and delivery of data. The position must apply effective data cleansing rules and implement consistent algorithms to transform a complex data architecture into a cohesive, consumer-usable data environment. The primary focus of this role will be to evaluate the current and planned data environment to identify opportunities for improvements, including data source corrections, reconciliation-and-compare opportunities, and ETL enhancements. This position will work closely with technology, data management, and project team partners to understand existing issues as well as future needs. The role will also map and plan the data needs and flows for new initiatives to help ensure that data processing is highly efficient, preferred data sources are used, and data integrity expectations are met.
• Over 5 years of experience in a data-related role.
• Expertise conducting troubleshooting analysis, creating use case scenarios, and creating and running thorough test cases.
• Strong analytical skills with the ability to collect, organize, analyze, and distribute information.
• Expertise in creating scripts and queries and pulling data through standard tools, including SQL Server and Oracle.
• A solid enterprise, end-to-end understanding of the business processes, applications, and data flow associated with investment management strategies across multiple asset types.
• Extensive knowledge and experience in database architecture standards and techniques, including data models, data transformation processes, and ETL tools.
• A thorough knowledge of data reconciliation and compare processes and tools (see the sketch after this posting).
• Ability to communicate issues, status, and progress on assigned tasks to partners and stakeholders, each with varied levels of technical expertise.

Additionally, the following skills would be greatly desired:
• Familiarity with reporting and visualization tools such as Tableau.
• Familiarity with investment management applications, including Charles River, Eagle PACE, BlackRock Aladdin, MSCI, and FactSet.
• Familiarity with Markit EDM.
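Given the emphasis above on reconciliation-and-compare processes across data sources, here is a minimal sketch of the pattern: pull the same keyed measure from two sources and report breaks above a tolerance. The source names, security identifiers, and tolerance are hypothetical; in practice both sides would come from SQL Server or Oracle queries rather than inline dicts.

```python
from decimal import Decimal

TOLERANCE = Decimal("0.01")  # hypothetical: flag breaks over one cent

def reconcile(source_a: dict[str, Decimal], source_b: dict[str, Decimal]):
    """Compare position market values keyed by security id."""
    breaks = []
    for key in sorted(set(source_a) | set(source_b)):
        a, b = source_a.get(key), source_b.get(key)
        if a is None or b is None:
            breaks.append((key, a, b, "missing in one source"))
        elif abs(a - b) > TOLERANCE:
            breaks.append((key, a, b, f"diff {a - b}"))
    return breaks

# Inline sample data keeps the sketch self-contained.
custodian = {"912828U40": Decimal("1000250.00"), "037833AK6": Decimal("50000.00")}
accounting = {"912828U40": Decimal("1000249.50"), "594918BP8": Decimal("75000.00")}

for brk in reconcile(custodian, accounting):
    print(brk)
```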
What we do: At Goldman Sachs, our engineers don't just make things; we make things possible. Change the world by connecting people and capital with ideas. Solve the most challenging and pressing engineering problems for our clients. Join our engineering teams that build massively scalable software and systems, architect low-latency infrastructure solutions, proactively guard against cyber threats, and leverage machine learning alongside financial engineering to continuously turn data into action. Create new businesses, transform finance, and explore a world of opportunity at the speed of markets. Engineering, which is comprised of our Technology Division and global strategists groups, is at the critical center of our business, and our dynamic environment requires innovative strategic thinking and immediate, real solutions. Want to push the limit of digital possibilities? Start here.

Who we look for: Goldman Sachs engineers are innovators and problem-solvers, building solutions in risk management, big data, mobile, and more. We look for creative collaborators who evolve, adapt to change, and thrive in a fast-paced global environment.

The Operations division is the engine room that powers Goldman Sachs, and Operations Technology is driving its industrialization through automation, digitization, and orchestration. The Derivative Data Engineering team develops and enhances critical components of the post-execution derivatives trade processing platform. We undertake a key role in the architecture of the derivatives platform, and our success contributes directly to the efficient functioning of the firm's trading and sales operations on a day-to-day basis. The team is responsible for the development of a model-driven, highly scalable, high-performance data storage and distribution framework. This provides low-latency access for online processing and leverages big data technologies for analytic processing. We process tens of millions of events every day and thousands of requests per second at peak times; hence we focus on delivering a horizontally scalable and resilient platform. The team is passionate about solving complex technical problems and delivering innovative, resilient, and performant systems. As a member of our team, you will work closely with business and technical colleagues locally and in London, Warsaw, and Bengaluru to solve challenging technical problems and deliver cutting-edge technical solutions to meet evolving business demands. Effective problem-solving, consensus building, and a highly developed analytical and technical skill set are essential to your success. We run an Agile (Scrum) team with iterative feedback loops with our clients and continual improvement of the team's processes. We are looking for talented engineers who have a thirst for learning and are passionate about building innovative solutions.

Job summary:
• Participate in the full software development lifecycle for systems written with Java, SQL, RESTful web services, JMS messaging, search, and big data solutions.
• Design, develop, test, and deliver highly performant and resilient client- and server-side systems in an Agile and fast-moving environment.
• Form strong client relationships with business and other technology groups.
• Through continuous learning, continue to contribute new ideas and opinions to the group.

Basic qualifications:
• Bachelor's or master's degree in computer science or a related field.
• 3+ years commercial software development experience with Java, ideally Java 8.
• Experience of an enterprise DBMS (ideally Sybase or DB2).
• Excellent communication skills.
• Ability to multi-task and prioritise work effectively.
• Motivated to deliver outstanding technology solutions in a dynamic business environment.
• Strong sense of ownership and driven to manage tasks to completion.

Preferred qualifications:
• Knowledge of open source frameworks (Spring, Spark).
• Experience with Elasticsearch and Kibana.
• Experience with technologies in the Hadoop stack.
• Experience of messaging systems such as TIBCO EMS or Apache Kafka.
• Experience working in a financial services firm.

The Goldman Sachs Group, Inc. is a leading global investment banking, securities, and investment management firm that provides a wide range of financial services to a substantial and diversified client base that includes corporations, financial institutions, governments, and individuals. Founded in 1869, the firm is headquartered in New York and maintains offices in all major
financial centers around the world. The Goldman Sachs Group, Inc., 2018. All rights reserved. Goldman Sachs is an equal employment/affirmative action employer: female/minority/disability/vet.

The Allen Institute for Brain Science and BRAIN Initiative collaborators will be generating many petabytes over the next five years toward the study of the role and function of cell types in the brain. We are launching a new engineering initiative to build the data pipelines, databases, and portals to ingest, analyze, and index the data, which will be published through web portals and search APIs. We are looking for a software engineer to work with the Institute and consortium leaders to build the databases which support this project. The ideal candidate will have experience building databases and ETL/ELT pipelines to support searching and analysis of data from many different sources. As an early member of this engineering team, this candidate will be responsible for building prototypes and implementing production-ready systems, including building databases and APIs, writing readable code with tests, and building, deploying, and maintaining the systems they build. The engineer will also have to decide when to reuse existing software systems within the Institute, maintaining and refactoring these systems when necessary.

Primary job responsibilities:
- Collaborate with the Institute and consortium leaders and software engineers in other groups to build the cell census databases and services.
- Develop, tune, and troubleshoot systems to ingest and transform data from multiple sources.
- Provide accurate estimates of the time and tradeoffs in implementing designs, along with the associated risks.
- Help maintain our existing lab information management system (LIMS) and data warehouse.
- Advise on the choice of database technologies and best practices.

Basic qualifications:
- Bachelor's degree in computer science, engineering, or a related discipline.
- 6-9 years of relevant experience on a software development team.
- Expertise in the design, build, and maintenance of relational databases (PostgreSQL, MySQL, SQL Server, Oracle).
- Experience designing APIs to provide access to large federated datasets.
- Knowledge of data warehousing and data pipeline concepts (star schemas, dimensional models, ETL, streaming).
- Working knowledge of NoSQL databases used for indexing and searching complex databases (Gremlin, Neo4j, Elasticsearch, etc.).
- Hands-on experience with large-scale data analysis solutions (Kafka, Hadoop, Spark).
- 6+ years of experience with at least one general-purpose object-oriented language (Python, C++, Java, C#).
- Experience developing testable code (unit testing, dependency injection, mocking, etc.).
- Experience with modern source control (Git, Mercurial).

Desirable qualifications:
- Advanced degree in computer science, engineering, or a related discipline.
- Experience setting up and running a continuous integration / continuous delivery system.
- Experience with Ruby on Rails with database applications and APIs.
- Familiarity with a modern functional programming language, e.g., Scala or Clojure.

**Please note: this opportunity does not sponsor work visas and has no relocation assistance.**

It is the policy of the Allen Institute to provide equal employment opportunity (EEO) to all persons regardless of age, color, national origin, citizenship status, physical or mental disability, race, religion, creed, gender, sex, sexual orientation, gender identity and/or expression, genetic information, marital status, status with regard to public assistance, veteran status, or any other characteristic protected by federal, state, or local
law. In addition, the Allen Institute will provide reasonable accommodations for qualified individuals with disabilities.

At Perficient you'll deliver mission-critical technology and business solutions to Fortune 500 companies and some of the most recognized brands on the planet. And you'll do it with cutting-edge technologies, thanks to our close partnerships with the world's biggest vendors. Our network of offices across North America, as well as locations in India and China, will give you the opportunity to spread your wings, too. We're proud to be publicly recognized as a "Top Workplace" year after year. This is due in no small part to our entrepreneurial attitude and collaborative spirit that sets us apart and keeps our colleagues impassioned, driven, and fulfilled.

Perficient currently has a career opportunity for a BI & Big Data Technical Architect in our data solutions office, located in Dallas/Houston.

Job overview: As a BI architect, you will be filling the role of a strategic BI lead for the data solutions practice that focuses on analytics within Perficient. For our BI practice, we are looking for someone who has the necessary technical expertise as well as an adept understanding of strategic business decisions. Work within this type of role requires a strong understanding of report development lifecycles, including requirements gathering, designing, and developing final solutions. In addition, we are looking for someone with a strong background in Agile project management, story-telling, prototyping, and sprint deployments. As part of the data team, you will be engaged on a multitude of projects that require a certain level of expertise from having worked in, or worked with, data analyst and QA roles.

Responsibilities:
- Excellent communication skills to interact with clients and lead developers, providing business support for deployed BI applications and developing documentation when needed.
- Serve as a technical lead and mentor; provide technical support or leadership in the development and continual improvement of service.
- Develop and maintain effective working relationships with team members; demonstrate the ability to adapt and work with team members of various experience levels.
- Become the Power BI & Azure expert, working on multiple facets of the Azure Cortana Intelligence Suite.
- Understand and design cube-based BI models using SSRS/AS.
- Able to work in a data warehousing environment with multiple datasets to integrate.

Required qualifications:
- 7+ years of strong experience in report development using Power BI, Tableau, and Qlik.
- 10+ years of creating SQL queries to provide ad-hoc reports, analysis, and datasets based on business needs.
- 5+ years of experience in an EDW environment, with a preference for the last 2 years on a big data distribution (on-prem or cloud).
- 1+ years of solid understanding of the DAX model and SSAS cubes.
- 5+ years of consulting experience with client-facing roles in BI.
- 5+ years of leading BI projects with multiple technologies.
- 5+ years of Microsoft BI tools, including Excel and SSIS/RS/AS.
- 2+ years of big data BI experience.
- 5+ years of experience defining, developing, and maintaining reports in multiple reporting platforms, including Tableau, Business Objects, Cognos, and SSRS/Power BI, with a focus on BO.
- At least 2+ years of experience working with large projects in the cloud (Azure).
- Assist with testing and quality assurance activities with data extracts, research, and analysis.
- Ability to learn and develop skills as needed to support services outside of this job description, proven with prior examples.
- Participate and lead in
- Experience building complex MS Excel solutions using pivot tables and other reporting tools.
- Experience collecting, analyzing, and documenting user requirements, and working closely with BAs on prototyping.
- Experience building and executing production migration plans.
- 2+ years of prototyping, with the ability to prove out a solution in the interview.
- Oracle and SQL experience (11gR2, SQL Server 2005-2012), plus PL/SQL and T-SQL (8+ years).

Preferred qualifications:
- Working knowledge of ETL tools like Informatica, DataStage, etc. (2+ years).
- Experience solving a complex BI problem with a BI tool, and the ability to articulate it (2+ years).
- Ability to work as a collaborative team member, mentoring and training junior team members.
- Working knowledge of building for, and reporting to, a data governance team.
- Experience working with large BI practices, such as top-10 large consulting firms.
- Passion for providing a solid UX to non-technical users.
- Erwin/PowerDesigner experience with dimensional modeling (5+ years).
- Experience with Jira and HP ALM (2+ years).

Preferred education/skills:
- Preferred: master's degree in MIS, computer science, data analytics, or equivalent.
- Bachelor's degree with a minimum of 5+ years of relevant experience, or equivalent.

Perficient full-time employees receive complete and competitive benefits. We offer a collaborative work environment, competitive compensation, generous work/life opportunities, and an outstanding benefits package that includes paid time off plus holidays. In addition, all colleagues are eligible for a number of rewards and recognition programs, including billable bonus opportunities. Encouraging a healthy work/life balance and providing our colleagues great benefits are just part of what makes Perficient a great place to work.

More about Perficient: Perficient is the leading digital transformation consulting firm serving Global 2000 and enterprise customers throughout North America. With unparalleled information technology, management consulting, and creative capabilities, Perficient and its Perficient Digital agency deliver vision, execution, and value with outstanding digital experience, business optimization, and industry solutions. Our work enables clients to improve productivity and competitiveness; grow and strengthen relationships with customers, suppliers, and partners; and reduce costs. Perficient's professionals serve clients from a network of offices across North America and offshore locations in India and China. Traded on the Nasdaq Global Select Market, Perficient is a member of the Russell 2000 index and the S&P SmallCap 600 index. Perficient is an equal opportunity employer, and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by law.

Disclaimer: The above statements are not intended to be a complete statement of job content; rather, they act as a guide to the essential functions performed by the employee assigned to this classification. Management retains the discretion to add to or change the duties of the position at any time.
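The ad-hoc SQL reporting the qualifications above call for can be reduced to a small, self-contained sketch; the table and column names here are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (region TEXT, product TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('south', 'widgets', 120.0),
        ('south', 'gadgets',  75.5),
        ('north', 'widgets', 310.0);
""")

# Ad-hoc business question: revenue by region, largest first.
report = conn.execute("""
    SELECT region, SUM(amount) AS revenue, COUNT(*) AS order_count
    FROM orders
    GROUP BY region
    ORDER BY revenue DESC
""")
for region, revenue, count in report:
    print(f"{region}: {revenue:.2f} across {count} orders")
```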
Comcast brings together the best in media and technology. We drive innovation to create the world's best entertainment and online experiences. As a Fortune 50 leader, we set the pace in a variety of innovative and fascinating businesses and create career opportunities across a wide range of locations and disciplines. We are at the forefront of change and move at an amazing pace, thanks to our remarkable people who bring cutting-edge products and services to life for millions of customers every day. If you share in our passion for teamwork, our vision to revolutionize industries, and our goal to lead the future in media and technology, we want you to fast-forward your career at Comcast.

**Data engineering and data science skills, combined with the demands of a high-volume, highly visible analytics platform, make this an exciting challenge for the right candidate.**

Are you passionate about digital media, entertainment, and software services? Do you like big challenges and working within a highly motivated team environment? As a senior data engineer in Comcast DX, you will research, develop, support, and deploy solutions using real-time distributed computing architectures. You will also employ your skills to deliver insights into customer and network behavior on a rapidly growing video-over-IP platform. The DX data engineering team is a fast-moving team of world-class experts who are innovating in end-to-end video delivery. We are a team that thrives on big challenges, results, quality, and agility.

**Who does the data engineer work with?** Data engineering is a diverse collection of professionals who work with a variety of teams, ranging from other engineering teams whose software integrates with analytics services, to service delivery engineers who provide support for our products, testers, operational stakeholders with all manner of information needs, and executives who rely on data to make decisions.

**What are some interesting problems you'll be working on?** Develop systems capable of processing millions of events per second and multi-billions of events per day, providing both a real-time and a historical view into the operation of our wide array of systems. Design collection and enrichment system components for quality, timeliness, scale, and reliability. Work on high-performance real-time data stores and a massive historical data store using best-of-breed, industry-leading technology. Design, develop, and apply advanced statistical methods and machine intelligence algorithms.

**Where can you make an impact?** Comcast DX is building the core components needed to drive the next generation of data platforms and data processing capability. Building data products, identifying trouble spots, and optimizing the overall user experience is a challenge that can only be met with a robust data architecture capable of providing insights that would otherwise be drowned in an ocean of data. Success in this role is best enabled by a broad mix of skills and interests, ranging from traditional distributed-systems software engineering prowess to the multidisciplinary field of data science.

**Responsibilities:**
+ Lead development for new products
+ Analyze massive amounts of data, in both real-time and batch processing
+ Prototype ideas for new tools, products, and services
+ Employ rigorous continuous delivery practices managed under an agile software development approach
+ Raise the bar for the engineering team by advocating leading-edge practices such as CI/CD, containerization, and TDD
+ Enhance our DevOps practices to deploy and operate our systems
+ Automate and streamline our operations and processes
+ Build and maintain tools for deployment, monitoring, and operations
+ Troubleshoot and resolve issues in our development, test, and production environments

**Here are some of the specific technologies we use:**
+ Spark, streaming and batch (see the sketch after this list)
+ Kafka, AWS Kinesis
+ Avro, Parquet
+ MemSQL, Cassandra, HBase, MongoDB
+ Redis
+ Java, Scala, Go
+ Git, Maven, Jenkins
+ Rancher, Puppet, Docker, Kubernetes
+ Linux
+ Hadoop (HDFS, YARN)
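As one illustration of how the items in that list fit together, here is a minimal Spark Structured Streaming sketch that reads an event stream from Kafka. The broker address, topic name, and windowing choice are placeholders, not details from the posting, and the job assumes the spark-sql-kafka connector is on the classpath:

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, window

spark = SparkSession.builder.appName("event-counts").getOrCreate()

# Subscribe to a hypothetical Kafka topic of raw events.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "video-events")
    .load()
)

# Count events per one-minute window, keyed by the Kafka message key.
counts = (
    events.withColumn("key", col("key").cast("string"))
    .groupBy(window(col("timestamp"), "1 minute"), col("key"))
    .count()
)

# Stream the aggregates to the console; a real job might write Parquet instead.
query = counts.writeStream.outputMode("complete").format("console").start()
query.awaitTermination()
```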
**Skills & requirements:**
+ 5 years of programming experience
+ Bachelor's or master's degree in computer science, statistics, or a related discipline
+ Experience in software development of large-scale distributed systems, including a proven track record of delivering backend systems that participate in a complex ecosystem
+ Experience in data-related technologies and open source frameworks preferred
+ Proficient in Unix/Linux environments
+ Test-driven development, test automation, continuous integration, and deployment automation
+ Enjoy working with data analysis, data quality, and reporting
+ Excellent communicator, able to analyze and clearly articulate complex issues and technologies understandably and engagingly
+ Great design and problem-solving skills
+ Adaptable, proactive, and willing to take ownership
+ Keen attention to detail and a high level of commitment
+ Thrives in a fast-paced agile environment; requirements change quickly, and our team needs to constantly adapt to moving targets

**About Comcast DX:** Comcast DX is a results-driven engineering team responsible for the delivery of the multi-tenant data infrastructure and platforms necessary to support our data-driven culture and organization. DX has an overarching objective to gather, organize, and make sense of Comcast data, with the intention of revealing business and operational insight, discovering actionable intelligence, enabling experimentation, empowering users, and delighting our stakeholders. Members of the DX team define and leverage industry best practices, work on extremely large-scale data problems, design and develop resilient and highly robust distributed data organizing and processing systems and pipelines, and research, engineer, and apply data science and machine intelligence disciplines. Comcast is an EOE/Veterans/Disabled/LGBT employer.

Position summary: Big data engineers serve as the backbone of the strategic analytics organization, ensuring both the reliability and the applicability of the team's data products to the entire organization. They have extensive experience with ETL design, coding, and testing patterns, as well as with engineering software platforms and large-scale data infrastructures. Big data engineers have the capability to architect highly scalable end-to-end pipelines using different open source tools, including building and operationalizing high-performance algorithms. Big data engineers understand how to apply technologies to solve big data problems, with expert knowledge in programming languages and platforms like Java, Python, Linux, PHP, Hive, Impala, and Spark. Extensive experience working with both 1) big data platforms and 2) real-time streaming delivery of data is essential. Big data engineers implement complex big data projects with a focus on collecting, parsing, managing, analyzing, and visualizing large sets of data to turn information into actionable deliverables across customer-facing platforms. They have a strong aptitude for deciding on the needed hardware and software design, and can guide the development of such designs through both proofs of concept and complete implementations. Additional qualifications should include: tuning Hadoop solutions to improve performance and end-user experience; proficiency in designing efficient and robust data workflows; documenting requirements, as well as resolving conflicts or ambiguities; experience working in teams and collaborating with others to clarify requirements; strong coordination and project management skills to handle complex projects; and excellent oral and written communication skills.
Job responsibilities, big data engineer. Responsibilities include:
§ Translate complex functional and technical requirements into detailed design
§ Design for now and for future success
§ Hadoop technical development and implementation
§ Loading from disparate data sets by leveraging various big data technologies, e.g., Kafka
§ Pre-processing using Hive, Impala, Spark, and Pig
§ Design and implement data modeling
§ Maintain security and data privacy in an environment secured using Kerberos and LDAP
§ High-speed querying using in-memory technologies such as Spark
§ Following and contributing to best engineering practice for source control, release management, deployment, etc.
§ Production support: job scheduling and monitoring, ETL, data quality, and data freshness reporting

Skills required:
§ 5-8 years of Python or Java/J2EE development experience
§ 3+ years of demonstrated technical proficiency with Hadoop and big data projects
§ 5-8 years of demonstrated experience and success in data modeling
§ Fluent in writing shell scripts [bash, korn]
§ Writing high-performance, reliable, and maintainable code
§ Ability to write MapReduce jobs
§ Ability to set up, maintain, and implement Kafka topics and processes (see the sketch after this list)
§ Understanding and implementation of Flume processes
§ Good knowledge of database structures, theories, principles, and practices
§ Understanding of how to develop code in an environment secured using a local KDC and OpenLDAP
§ Familiarity with, and implementation knowledge of, loading data using Sqoop
§ Knowledge of, and ability to implement, workflow schedulers within Oozie
§ Experience working with AWS components [EC2, S3, SNS, SQS]
§ Analytical and problem-solving skills applied to the big data domain
§ Proven understanding of, and hands-on experience with, Hadoop, Hive, Pig, Impala, and Spark
§ Good aptitude for multi-threading and concurrency concepts
§ B.S. or M.S. in computer science or engineering
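Setting up a Kafka topic and publishing to it, as the skills list above mentions, can be sketched with the kafka-python client. The broker address, topic name, and event payload are invented for illustration:

```python
import json
from kafka import KafkaProducer
from kafka.admin import KafkaAdminClient, NewTopic

BROKER = "localhost:9092"  # placeholder broker address

# Create a topic with 3 partitions; an idempotent setup script would
# catch kafka.errors.TopicAlreadyExistsError here.
admin = KafkaAdminClient(bootstrap_servers=BROKER)
admin.create_topics([NewTopic(name="clickstream", num_partitions=3, replication_factor=1)])

# Publish JSON-encoded events to the new topic.
producer = KafkaProducer(
    bootstrap_servers=BROKER,
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("clickstream", {"user": "u123", "action": "page_view"})
producer.flush()
```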
Evo is seeking a big data engineer for our Beaverton client. This contract opportunity is scheduled to be 1 year.

Skills and requirements:
- M.S./B.S. in computer science or a related technical discipline
- Strong programming experience, Scala preferred
- Experience working with big data streaming services such as Kinesis, Kafka, etc.
- Experience working with big data streaming frameworks such as NiFi, Spark Streaming, Flink, etc.
- Experience working with NoSQL data stores such as HBase, DynamoDB, etc.
- Experience building domain-driven microservices
- Experience provisioning RESTful APIs to enable real-time data consumption
- Experience with performance and scalability tuning
- Desire to work collaboratively with your teammates to come up with the best solution to a problem
- Demonstrated experience and ability to deliver results on multiple projects in a fast-paced, agile environment
- Excellent problem-solving and interpersonal communication skills
- Strong desire to learn and to share knowledge with others

Preferred skills:
- Experience in Python or Java
- Experience working with Hadoop and big data processing frameworks such as Spark, Hive, etc.
- Experience with SQL and SQL analytical functions
- Experience working in a public cloud environment, particularly AWS
- Familiarity with practices like continuous development, continuous integration, and automated testing
- Familiarity with build tools such as CloudFormation and automation tools such as Jenkins
- Agile/Scrum application development experience
- An interest in artificial intelligence and machine learning

Not a fit for you, but know someone who might be? Refer them! We have a great referral program where you can earn up to $375 per referral; find out more at www.evosolutions.com/refer. Applicants must be fully authorized to work in the U.S. and physically be in the U.S. Corp-to-corp requests will not be entertained. Relocation assistance will not be available for this position. Evo is an equal opportunity employer and considers qualified applicants for employment without regard to race, gender, age, color, religion, disability, veteran status, sexual orientation, gender identity, or any other protected factor.

Everyone at our company, from our entry-level consultants to our principals, interacts with clients in several ways; examples often include field inspections, marketing presentations, industry conferences, and project reporting. The responsibility for client interaction, management, and development increases as you progress along our consulting career path. Building client relationships starts right away: our engineers, scientists, and senior engineers/scientists often participate in field inspections, teleconferences, or face-to-face client meetings. Working with our established consultants, they receive hands-on experience and begin the process of building these relationships by doing excellent work on time and within budget. Many of our entry-level hires have found themselves on an inspection during their first week at our company!

Requirements:
1. Candidates must have hands-on experience with GoldenGate skills, including the development lifecycle, testing methodology, version control, and problem management.
2. Candidates must have current practical experience with GoldenGate in a production environment.
3. Must have extensive change data control, limited ETL design/build, and RDBMS exposure.
4. Advanced debugging and troubleshooting experience.
5. Current practical experience in configuration, administrative scripting, supporting, and monitoring with GoldenGate.
6. Must have work experience with Unix and an understanding of moderately complex Unix shell scripts.
7. Effective verbal and written communication skills; self-motivated, with a high level of initiative; being a team player with the ability to multitask is a big plus.

All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, or national origin. As our consultants progress into manager and senior manager roles, they are responsible for managing key client relationships and developing business for themselves and others. They market our multidisciplinary services to both existing company clients and potential new ones. Our principals are often working with our largest clients and managing our most significant and challenging projects.

**Enterprise Data Analytics Services Data Manager, IRES - SAFB**

**Description:** Applies knowledge and know-how in data science to applications pertinent to subject matter experts at MDA. Strong interpersonal skills and writing ability commensurate with programming skills, mathematical skills, and statistical background. The data scientist, as a subject matter expert in the field, collaborates with business units and end users to create compelling and accurate data presentations based on the entire EDAS service suite: design, prototype, data ingestion, cleaning, curation, storage, and retrieval, concentrating particularly on analytic support, statistical modeling support, dissemination support, and presentation support. The data scientist will work with developers and data analysts from EDAS to build and maintain an end-to-end user experience and interface lifecycle based on business, functional, and performance requirements.
**Qualifications:** General experience must include 14 years of related experience and 12 years in data science disciplines, including mathematics, physical or social sciences, statistics, or a related discipline, with knowledge of programming in one or more of the following: Java, C, Python, R, C++, Scala, Lisp, Mathematica, MATLAB, or Ruby. B.S. degree from an accredited college or university. Must be a U.S. citizen. Must have a current, active DoD Secret clearance. Current incumbents will have priority. Desired: the developer should have one relevant industry certification at the professional level.

Jacobs is an equal opportunity/affirmative action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, veteran status, or other characteristics protected by law. Jacobs is a background-screening, drug-free workplace.

**Primary location:** United States-Colorado-Schriever AFB. **Travel:** Yes, 25% of the time. **Req ID:** AS0002VO

Unizin is looking for an experienced data engineer to be responsible for expanding and optimizing our data and data pipeline architecture. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys streamlining data systems and building them from the ground up. The right candidate will be excited by the prospect of optimizing, or even re-designing, our company's data architecture to support our next generation of products and data initiatives.

Responsibilities:
- Create and maintain optimal data pipeline architecture.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources, using SQL and cloud-platform (think AWS or Google) 'big data' technologies.
- Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.

Qualifications:
- Advanced working SQL knowledge and experience working with relational databases, including query authoring (SQL), as well as working familiarity with a variety of databases.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with unstructured datasets.
- Ability to build processes supporting data transformation, data structures, metadata, dependency, and workload management.
- Working knowledge of message queuing, stream processing, and highly scalable 'big data' data stores.

We are looking for a candidate with substantial experience in a data engineer role who has attained a degree in computer science or a related field. They should also have experience using the following software tools:
- Big data tools such as Hadoop and Spark
- Relational SQL and NoSQL databases such as Postgres and BigQuery
- Data pipeline and workflow management tools such as Airflow (a minimal sketch follows this list)
- Google (preferred) or Amazon cloud services
- Languages like Python, Java, C++, Scala, etc.
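Workflow tools like Airflow, named in the tool list above, express a pipeline as a DAG of tasks. A minimal sketch, with invented task names and a hypothetical daily schedule:

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw rows from a source system")

def transform():
    print("clean and reshape the rows")

def load():
    print("write the result to the warehouse")

# One DAG run per day; tasks execute in extract -> transform -> load order.
with DAG(
    dag_id="example_etl",
    start_date=datetime(2024, 1, 1),  # arbitrary placeholder start date
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3
```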
Benefits: great team; cool technology; a mission of high social value; competitive salary; excellent medical, dental, and optical plans; 403(b) with matching; open PTO; free parking; a loaded kitchen; flexible hours.

About Unizin: Unizin is a non-profit consortium of research universities that is working together to improve digital teaching and learning in higher education. We collaborate to advance the understanding and practices of content and data in the context of learning. Our members use shared cloud platforms to manage affordable content delivery, digital learning experiences, and teaching and learning data flows, to create a unified view of the student and support learning science research. Today, more than 1 million students are served by our members and will be impacted by the services we offer. These universities are using their combined mission of education and research to close the loop between learning science and pedagogy. Research to drive advancements in teaching and learning is breaking new ground for higher education, and you can be a part of that. We are a startup inside a collaborative of substantial scale and resources, fully funded by the investments of our membership for a sustainable mission to enable the next generation of digital learning environments and ecosystems. Visit us at www.unizin.org.

Data center architect - remote. This role will support a project to migrate 170 client applications to the company domain infrastructure as part of a new service offering called Remote Infrastructure Management - Converged Infrastructure Services. This is a solutions architect role that requires broad experience and knowledge in both applications and infrastructure (network, firewall, hardware, etc.) architecture. Applications exist across multiple technology platforms, but primarily MS products. Must understand how applications behave under various conditions. Must understand and plan for application future state in the new environment. Will take client requirements and lead the BA team in building, documenting, and executing migration plans. Deep SQL expertise is a must.

Required:
- Twelve or more years of systems development experience, with five or more years as an engineer or systems specialist
- Deep knowledge/understanding of the SDLC and application architecture
- Deep knowledge/understanding of network infrastructure architecture
- Demonstrated experience in previous application migration projects/initiatives
- Experience with applications hosted on converged or hyper-converged infrastructure strongly preferred
- 100% remote; must be based in the US
- Bachelor's degree in computer science or a related field preferred

Kforce has a client in search of a data architect in Waltham, Massachusetts (MA). Essential job functions:
* Determine data requirements and structure by analyzing existing business systems and processes
* Ensure flexible data models for future business transformation activities
* Maintain database performance
* Create migration plans, process flows, and dependency diagrams as business needs arise
* Partner with cross-functional and third-party vendor teams
* Bachelor's degree in computer science, information science, or a similar field
* Proven experience as a data architect, data scientist, or in a similar role
* Experience with service-oriented architecture (SOA), web services, enterprise data management, information security, applications development, and cloud-based architectures
* Experience with Informatica products, especially the data management suite
* Experience with cloud hosting environments such as AWS or Azure
* Knowledge of data mining and segmentation techniques
* In-depth understanding of database structure principles
* Familiarity with Salesforce, SalesLogix, NetSuite, or Oracle DB
* Familiarity with data visualization tools (e.g., Tableau, D3.js, and R)
* Familiarity with Boomi, Pentaho, or Informatica ETL
* Ability to independently create blueprints, as well as data models from requirements or independent systems forensics
* Appreciation of an agile development environment
* Passion for technology and self-education to recommend new technologies and techniques

Kforce is an equal opportunity/affirmative action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, pregnancy, sexual orientation, gender identity, national origin, age, protected veteran status, or disability status. *Compensation type:* hours.

Pros2Plan, a Spinnaker company, is now hiring for a Sr. Supply Chain Data Governance Consultant. Location: anywhere in the US (headquartered in Houston, TX; willing to travel). Duration: full time. Salary: open.

Position description: Data governance and reporting lead on supply chain consulting engagements. This position will partner with business and IT users to document definitions for data and calculations, and to define data quality metrics and processes (a profiling sketch follows the requirements list below). Develop business requirements documents, functional specifications, and reports as needed for data defect remediation and to support process change. Engagements will likely include: assessing a client's current master data architecture and identifying areas of risk; assessing a client's current master data maintenance processes and identifying potential areas of risk; documenting data maintenance processes, data design, points of control, supply chain metadata, and orphan data; identifying potential points of data integrity risk and defining audits; directing data defect remediation of enterprise system data that feeds a supply chain implementation; developing workflow processes for data entry and governance of the supply chain; facilitating process redesign efforts; and recommending infrastructure changes to deliver operational and data governance reports specific to a client's supply chain systems.

Position requirements:
- A minimum of a bachelor's degree in supply chain, MIS, or equivalent
- Strong problem-solving skills
- 7+ years of overall experience, including IT, MDM, data governance, consulting, and/or IT audit
- 3+ years working with complex data structures
- Strong SQL data extraction skills required
- Experience with data transformation tools (Alteryx, ETL, etc.)
- Advanced Excel skills: spreadsheet analysis, functions, and charting
- Knowledge of/exposure to supply chain solutions
- Knowledge of SAP ERP data structures a plus
- Experience using data visualization tools (Tableau, Power BI, QlikView)
- Ability to explain complex issues to non-technical business users
- Ability to interact with all levels of the business (analysts, managers, executives)
- Proven ability to navigate complex organizational politics and motivations
- Experience managing customer escalations
- Ability to think strategically
- Must be able to work independently with minimal direction
- Ability to adapt to shifting priorities and to multitask to support client priorities
- Exceptional analytic and critical thinking skills, writing skills, communication skills, and consulting skills, and the ability to work within a team
- Ability to consider others' ideas seriously and accept feedback
- Able to work in the U.S., work from home, and travel domestically or internationally up to 100%
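Defining data quality metrics, as this role calls for, often starts with simple profiling. A minimal pandas sketch computing per-column completeness and a duplicate-key rate over a hypothetical master-data extract (all names invented):

```python
import pandas as pd

# Hypothetical master-data extract; in practice this would come from SQL.
df = pd.DataFrame({
    "material_id": ["M1", "M2", "M2", "M4"],
    "plant": ["P10", "P10", None, "P20"],
    "lead_time_days": [5, None, 7, 3],
})

# Completeness: share of non-null values per column.
completeness = df.notna().mean()

# Uniqueness: share of duplicated key values (material_id should be unique).
duplicate_rate = df["material_id"].duplicated().mean()

print("completeness by column:\n", completeness)
print(f"duplicate material_id rate: {duplicate_rate:.0%}")
```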
Transform health care and change the way consumers engage with technology. Sounds like a big challenge, right? Here at Optum you have the opportunity to achieve great things while you showcase your passion and technical expertise. As a big data developer, you'll be inspired by working alongside some of the most brilliant minds in technology, who bring compassion, energy, and focus to their work every day, on a mission to change health care as we know it. Join us! Come share who you are and start doing your life's best work.(sm)

As part of our development team, you will predominantly be involved in building business solutions by creating new, and modifying existing, software applications. You'll stretch your skills and grow your career as a primary contributor in designing, coding, testing, debugging, documenting, and supporting all types of applications, consistent with the established specifications and business requirements, in order to deliver business value. You'll enjoy the flexibility to telecommute* from anywhere within the U.S. as you take on some tough challenges.

Primary responsibilities:
- Develop innovative data management and analytic solutions to meet client requirements on a big data platform
- Lead large, complex projects to achieve key business objectives
- Translate highly complex concepts in ways that can be understood by a variety of audiences
- Perform complex data analysis to support research requests by the client
- Follow quality assurance guidelines, including the documentation, review, and approval of all project-related artifacts
- Troubleshoot client and operational issues quickly and comprehensively
- Lead analytic development projects of varying complexity
- Design, develop, document, and architect Hadoop applications
- Develop MapReduce code that works seamlessly on Hadoop clusters (a minimal example follows below)
- Seamlessly convert hard-to-grasp technical requirements into outstanding designs

Required qualifications:
- Undergraduate degree or equivalent experience
- 3+ years working in a big data environment
- Knowledge of data loading tools like Flume and Sqoop
- 3+ years of data analysis experience
- 2+ years of Oracle database experience
- 2+ years of testing and writing data analytics or business intelligence reports
- Knowledge of, and experience with, big data concepts and common components, including Hadoop, Spark, and multiple languages (Java, Scala, Python, R)

Preferred qualifications:
- Undergraduate degree in information systems, computer science, or data science
- Healthcare data analytics experience (financial and metrics-based reporting)
- IT technical writing skills
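MapReduce jobs on a Hadoop cluster can be written in Python via Hadoop Streaming, where the mapper and reducer read stdin and write stdout. A minimal word-count-style sketch; the job itself is illustrative and not from the posting:

```python
#!/usr/bin/env python3
"""Mapper emits (word, 1) pairs; reducer sums counts per word.
Both halves live in one file here, selected by a command-line flag, e.g.:
hadoop jar hadoop-streaming.jar -mapper 'wc.py map' -reducer 'wc.py' ..."""
import sys

def mapper():
    for line in sys.stdin:
        for word in line.split():
            print(f"{word}\t1")

def reducer():
    # Hadoop sorts mapper output by key before the reducer sees it.
    current, total = None, 0
    for line in sys.stdin:
        word, count = line.rsplit("\t", 1)
        if word != current:
            if current is not None:
                print(f"{current}\t{total}")
            current, total = word, 0
        total += int(count)
    if current is not None:
        print(f"{current}\t{total}")

if __name__ == "__main__":
    mapper() if sys.argv[1:] == ["map"] else reducer()
```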
Careers with Optum: here's the idea. We built an entire organization around one giant objective: make the health system work better for everyone. So when it comes to how we use the world's large accumulation of health-related information, or guide health and lifestyle choices, or manage pharmacy benefits for millions, our first goal is to leap beyond the status quo and uncover new ways to serve. Optum, part of the UnitedHealth Group family of businesses, brings together some of the greatest minds and most advanced ideas on where health care has to go in order to reach its fullest potential. For you, that means working on high-performance teams against sophisticated challenges that matter. Optum: incredible ideas in one incredible company, and a singular opportunity to do your life's best work.(sm)

*All telecommuters will be required to adhere to UnitedHealth Group's telecommuter policy. Diversity creates a healthier atmosphere: UnitedHealth Group is an equal employment opportunity/affirmative action employer, and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, age, national origin, protected veteran status, disability status, sexual orientation, gender identity or expression, marital status, genetic information, or any other characteristic protected by law. UnitedHealth Group is a drug-free workplace; candidates are required to pass a drug test before beginning employment.

Job keywords: big data developer, Flume, Sqoop, data analysis, database, analytics, telecommute, telecommuter, remote, work from home. *Big Data Developer - Telecommute* *Minnesota-Minnetonka* *758585*

Position description. Job title: Senior Data Engineer - NSW ODS. Job type: full-time. Compensation: commensurate with experience, scope of work, and location. Location: Coronado, CA 92134. Reports to: program manager. Mandatory: considering local candidates with US citizenship to support this government contract; the employer will not sponsor applicants for work visas for this position; the position requires an active DoD Secret clearance.

Position summary: The senior data engineer contractor will have 5 or more years of data engineering experience, including extract/transform/load (ETL), data ingestion, and distribution operations, and experience in designing, maintaining, and providing support for operational data stores, data marts, data lakes, and data warehouses. The senior data engineer will ensure data accuracy, availability, integrity, and confidentiality, and will evaluate new data systems and technologies for possible adoption and for improvements to the performance of existing applications.

Experience and credentials: Requires experience in data, database, data store, data mart, data lake, and data warehouse design and operations, and in business systems design and engineering. Requires skills in developing ETL code and in modern programming languages and platforms, including Structured Query Language (SQL), Microsoft (MS) VB.NET, MS C#.NET, MS SQL Server, MS SQL Server Integration Services (SSIS), MS SQL Server Reporting Services (SSRS), MS ASP.NET, MS Internet Information Server (IIS), HTML, JavaScript, jQuery, CSS, application life cycle management, MS Visual Studio and Team Foundation Services (TFS), the Microsoft SQL Server BI stack including SQL Server (2012+), Oracle GoldenGate, and SQL database servers. Has both the ability to work independently and the ability to work as a team with data analysts, scientists, data engineers, and other stakeholders. Requires a bachelor's or master's degree in a related information technology field. Prior experience working in the NSW community is highly desired. Must be a motivated, self-driven team player who can interact well with others and advise/consult with other team members on related issues. Must have excellent written and verbal English communication skills; good organizational skills and attention to detail will be required. Will be required to undergo and pass applicable background checks. In accordance with DoD 8570.01, the candidate must meet the Information Assurance Technician (IAT) Level II requirements by having one of the following certifications: GSEC, Security+ CE, SSCP, CCNA-Security, CISSP, CISA, CASP, GCIH, or GCED.
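Much of the ETL work described above targets SQL Server, which Python commonly reaches through pyodbc. A minimal sketch; the driver name is a common one, and the server, database, and table names are placeholders:

```python
import pyodbc

# Placeholder connection details; in practice these come from configuration.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=ods-server;DATABASE=nsw_ods;Trusted_Connection=yes;"
)
cursor = conn.cursor()

# Load one transformed batch into a hypothetical staging table.
rows = [("2024-01-01", "sensor-1", 42.0), ("2024-01-01", "sensor-2", 17.5)]
cursor.executemany(
    "INSERT INTO staging.readings (read_date, source, value) VALUES (?, ?, ?)",
    rows,
)
conn.commit()
conn.close()
```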
MILVETS offers an excellent benefits package, including health insurance, dental insurance, life insurance, disability insurance, vision, 401(k), and paid time off. Applicants for U.S.-based positions with MILVETS Systems Technology, Inc. must be legally authorized to work in the United States; verification of employment eligibility will be required at the time of hire. Visa sponsorship is not available for this position. www.dhs.gov/e-verify. E-Verify is a registered trademark of the U.S. Department of Homeland Security; this business uses E-Verify in its hiring practices to achieve a lawful workforce. Equal employment opportunity: all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, pregnancy, sexual orientation, gender identity, national origin, age, protected veteran status, or disability status. MILVETS Systems Technology, Inc. is an equal employment opportunity/affirmative action employer and maintains a drug-free workplace.

## Description

Invesco is a leading global asset management firm with more than $917.5B* in assets under management. We provide our retail and institutional clients a diverse and comprehensive range of investment capabilities to help people get more out of life. Invesco is publicly traded on the New York Stock Exchange (IVZ) and has about 7,000 employees in over 20 countries. (*As of September 30, 2017.)

**_Job purpose (job summary):_** Participate actively in a review of Invesco's enterprise data taxonomy under the direction of the metadata manager. Ensure that the data model of key data domains is accurate and describes the business needs for the data concerned.

**_Key responsibilities/duties:_**
* Establish an understanding of the business uses of data classified as key to several fundamental data domains.
* Review existing enterprise data models and taxonomy for those data domains, and propose changes to the taxonomy to make it more complete, accurate, and consistent.
* Remove ambiguity and duplication within the data model and taxonomy.
* Obtain approval for the proposed changes and implement them within Invesco's data taxonomy.
* Maintain the correct linkages between the new taxonomy and the physical data used within Invesco's key systems.
* Provide plans for the work involved and regular status updates to the metadata manager.

## Qualifications

**_Work experience/knowledge:_**
* Working towards a degree in finance, technology (incl. computer science, information science, management information systems, data science), or engineering.
* Experience with data modelling concepts and tools an advantage.

**_Skills/other personal attributes required:_**
* Ability to understand abstract concepts and relate them to real-world situations.
* Ability to communicate well with different audiences, with clarity and conciseness, demonstrated by well-developed written and verbal skills and the ability to deliver content to moderate-sized groups.
* Organizational skills necessary to meet deadlines, prioritize tasks, and process several streams of work simultaneously.
* Self-driven and motivated, with the ability to independently deliver on assignments.
* Rich attention to detail, to capture and communicate complex topics.
* Ability to understand context and ensure detail is in line with broader objectives.
* Working knowledge of Microsoft Office tools, primarily Word, Excel, and PowerPoint.
* Ability to utilize Visio to diagram data or process flows.

**_Working conditions:_**
* Normal office environment with little exposure to noise, dust, and temperature extremes.
* The ability to lift, carry, or otherwise move objects of up to 10 pounds is also necessary.
* Normally works a regular schedule of hours; however, hours may vary depending upon the project or assignment.

**_FLSA (US only):_** Nonexempt.

The above information on this description has been designed to indicate the general nature and level of work performed by employees within this role. It is not designed to contain, or be interpreted as, a comprehensive inventory of all duties, responsibilities, and qualifications required of employees assigned to this job; the job holder may be required to perform other duties as deemed appropriate by their manager from time to time. Invesco's culture of inclusivity and its commitment to diversity in the workplace are demonstrated through our people practices. We are proud to be an equal opportunity employer.
All qualified applicants will receive consideration for employment without regard to race, creed, color, religion, sex, gender, gender identity, sexual orientation, marital status, national origin, citizenship status, disability, age, or veteran status. Our equal opportunity employment efforts comply with all applicable U.S. state and federal laws governing non-discrimination in employment.

## Job
*Job:* Global Information Delivery Services (GIDS) and other data management. *Primary location:* North America-United States-Kentucky-Louisville-400 West Market Street, Suite 3300. *Schedule:* part-time. *Req ID:* 21054

Seeking a Sr. Data Architect for a long-term, open-ended contract opportunity in Torrance, CA.

Responsibilities: Influence projects and initiatives, and drive decisions related to data, including data quality, data architecture, and data management best practices. The ideal candidate will have project experience with data modeling using industry-leading tools like Erwin or ER/Studio, and a strong understanding of modern data warehouse concepts and of predictive/cognitive analytics concepts; exposure to cloud-based data & analytics solutions would be a huge plus.

Responsibilities include:
· Lead key business-critical projects in the capacity of a data architect and advisor
· Define data architecture standards and best practices, and participate in governance activities
· Lead key data architecture & data management initiatives
· Review business requirements and technical design documents to develop effective data and database solutions
· Create, maintain, and define conceptual, logical, and physical data models for various manufacturing projects using relational theory and dimensional modeling, and align them to the enterprise architecture (a dimensional-model sketch follows this posting)
· Provide insight into logical business data requirements, and work with DBAs to support physical data model considerations
· Define, maintain, and adhere to enterprise modeling standards
· Develop ETL specifications and document data migration mappings and transformations for data warehouse loading

Required qualifications:
· 5 to 8 years of experience in data architecture principles, methods, techniques, and technologies
· Experience with master data management and data governance
· Bachelor's degree or master's degree in computer science or an applicable field of study
· Proven background in designing and implementing architectural solutions that solve strategic and tactical business needs
· Strong technical understanding of data modeling, design, and architecture principles and techniques across master data, transaction data, and derived analytic data
· Deep knowledge of best practices through relevant experience across data-related disciplines and technologies, particularly for enterprise-wide data architectures, data management, data governance, and data warehousing
· Highly competent with relational database design and Structured Query Language (SQL)
· Highly competent with data modeling using industry-leading tools like Erwin or ER/Studio
· Expert-level understanding of modern data warehouse concepts
· Understanding of advanced analytics needs, such as predictive/cognitive analytics concepts
· Exposure to cloud-based data & analytics solutions
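Dimensional modeling of the kind this role calls for is often concretized as a star schema: a fact table keyed to dimension tables. A minimal SQLAlchemy sketch with invented table and column names, using SQLite only for illustration:

```python
from sqlalchemy import Column, Date, ForeignKey, Integer, Numeric, String, create_engine
from sqlalchemy.orm import declarative_base

Base = declarative_base()

class DimProduct(Base):
    """Dimension: descriptive attributes of a product."""
    __tablename__ = "dim_product"
    product_key = Column(Integer, primary_key=True)
    name = Column(String, nullable=False)
    category = Column(String)

class FactSales(Base):
    """Fact: one row per sale, with measures plus dimension keys."""
    __tablename__ = "fact_sales"
    sale_key = Column(Integer, primary_key=True)
    product_key = Column(Integer, ForeignKey("dim_product.product_key"))
    sale_date = Column(Date, nullable=False)
    quantity = Column(Integer, nullable=False)
    amount = Column(Numeric(10, 2), nullable=False)

# Materialize the physical model (SQLite here; any RDBMS in practice).
engine = create_engine("sqlite:///warehouse.db")
Base.metadata.create_all(engine)
```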
About Gemaire: As the leading edge of a national top-10 HVAC wholesaler, gemaire.com is in the unique position of having a large existing business to serve as we scale our B2B business digitally. Gemaire.com is an integral part of the long-term strategy of Gemaire and is already contributing substantially to the business. As a leader of the team, you will be responsible for understanding business needs and direction, and for architecting, testing, developing, and owning the technology behind our websites, mobile applications, and backend system integrations.

About the opportunity: The data developer/engineer is a cross-functional leader, analyzing business systems requirements, processes, and system integration considerations to determine and develop appropriate technology solutions across PIM (product information management), ecommerce, and marketing. Designs, configures, codes, tests, and documents solutions based on system and user requirements, using current programming languages, business systems interfaces, and technologies. Implements the solution by configuring business systems or by writing custom code, and performs testing and debugging of the solution. Completes necessary documentation and procedures for installation and maintenance of the solution.

Essential duties and responsibilities:
- Understand and document the data within a large corporation that has decades of operational, customer, and product information data. Leverage this understanding to help the leadership team profile customers, improve customer experiences (especially on the rapidly growing gemaire.com website), and guide change within the company.
- Be responsible for gathering and understanding business requirements and needs, and translate them into scalable, maintainable data transformation and management systems.
- Be a key contributor to the development and constant improvement of our digital presence while adapting to business priorities.
- Prepare flow charts, business process management (BPM) workflows, and systems diagrams to assist in problem analysis.
- Design, configure, code, test, debug, and document software solutions according to SBD-IT systems standards, policies, and procedures.
- Prepare test data for unit, development, integration, and performance testing.
- Ensure adoption of, and adherence to, best practices.

Requirements:
- Educated: bachelor's degree in computer science or computer engineering, or equivalent work experience.
- Embrace new technology: Gemaire is rapidly expanding its online presence and requires a leader who is constantly learning and adjusting.
- Love to automate: this is a data movement, organization, process improvement, and development role. You will be documenting existing processes; identifying opportunities to automate internally and externally; crowdsourcing data; and improving processes by building or implementing new tools, writing code, mentoring a team of data processing professionals, and interacting with multiple APIs every day.
- SQL and relational databases: as an operating distributor, our and our vendors' APIs are responsible for reaching into many SQL databases (DB2 would be a plus) in the act of servicing requests.

Technology skills:
- Development methodology: agile, Scrum, continuous integration
- Data tools: Python, Java, Scala, Perl, PHP, Informatica
- Skills: SQL, parsers, grammars, data cleansing, machine learning, anomaly detection and analysis
- Platforms: AWS, Rackspace, Linux, MS Windows, AS/400
- Databases: MySQL, MSSQL, PostgreSQL, DB2
- Measurement and analytics: Google Analytics, RJMetrics, MicroStrategy
- Nice to haves: graph databases (e.g., Neo4j)

Business skills:
- Collaboration: gain an understanding of business needs, coordinate across multiple geographically separate teams, seek out siloed data within the corporation, and ingest that data into a usable format
- Data management: define dataflows, data mappings, and data governance

Compensation, perks & benefits: competitive compensation with bonus potential, commensurate with experience; full health benefits (medical, dental, vision); 401(k); paid time off and tuition reimbursement; fast-paced work environment; well-funded public parent company (NYSE: WSO).
**About us:** Symantec Corporation (NASDAQ: SYMC) is the global leader in cyber security, operating one of the world's largest cyber intelligence networks. We see more threats and protect more customers from the next generation of attacks. We help companies, governments, and individuals secure their most important data wherever it lives. We make the world a safer place by helping people, businesses, and governments protect and manage their information so they can focus on achieving their goals.

**Sr. Enterprise Architect, Data Domain**

**Job description:**
+ Provides overall direction, guidance, and definition of the applications and data architecture to support the company's business strategy.
+ Works with key business stakeholders to gain a strong understanding of the business objectives and related technology needs across the enterprise.
+ Builds and maintains reference architectures for key programs and ensures that solution architecture teams adhere to the right architectures.
+ Actively participates in the architecture review board to ensure that the right architecture is achieved through collaboration with other architects.
+ Works with multiple service owners to ensure that the right solution architecture is achieved.
+ Very strong communication and influencing skills are a must, as enterprise architects will frequently interface with key business and IT leaders at every level.

**Qualifications:**
+ Experience with the creation and maintenance of end-to-end data strategies, architectures, and data flows between transaction systems and BI.
+ Demonstrated experience representing IT in data governance and influencing business and IT stakeholders on architectural changes and strategies.
+ Familiarity and experience working with CRM and ERP data architectures.
+ Experience in MDM, data warehousing, BI and analytics, and data science/AI.
+ Bachelor's degree in computer science or an MS degree; any relevant certifications, such as TOGAF, would be an advantage.
+ Experience working with enterprise architecture frameworks such as TOGAF or Zachman is required, along with a working knowledge of EA tools for modeling architecture.
+ 10 or more years of professional IT experience in a large-scale distributed environment, with hands-on experience as an enterprise architect for at least three years, focused on the data domain.
+ Proven expertise in enterprise architecture cutting across business processes, applications, data, and infrastructure will be an advantage.

Symantec is an equal opportunity employer. All candidates for employment will be considered without regard to race, color, religion, sex, gender identity, sexual orientation, national origin, physical or mental disability, veteran status, or any other basis protected by applicable federal, state, or local law. Relentlessly protect the world's information. Make a difference at Symantec. Across the globe, we are an essential partner to both consumers and businesses of all sizes. We combine our talents, our brains, and our creative energy to reinforce our place as a world-class technical community. Our most critical asset at Symantec is the talent we hire: you!
We look for people who have a desire to excel and who reflect our values: innovation, action, customer-driven, and trust. We recognize that every opening in our company is a chance to increase Symantec's competitive advantage, and we are willing to invest in you in order to win. Symantec will respond to requests for reasonable accommodations to assist you in applying for positions or submitting a resume; if you need to request an accommodation, please contact HR Service Exchange at https://symantec.service-now.com/hrp. EEO is the law: applicants and employees of Symantec Corporation are protected under federal law from discrimination; to find out more, see http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf.

eiWorkflow Solutions, LLC is a cloud software consulting firm based in Albany, NY. eiWorkflow Solutions, LLC is currently looking for a consultant for the following role: Oracle data warehouse programmer.

Tasks the role will be performing:
- Perform analysis, design, and development to maintain and enhance the current data warehouse environment
- Develop new, and modify current, ETL (extract, transform, load) processes using Oracle Warehouse Builder
- Develop new, and enhance existing, reports using Oracle Reports and Oracle Discoverer
- Create or modify the Oracle Discoverer environment
- Assist with design and planning for the OBIEE environment
- Participate in the development and execution of test plans
- Write SQL, SQL stored procedures, and PL/SQL packages, as well as maintain and modify existing code
- Assist in resolving reported data warehouse issues
- Create documentation for data warehouse structures, components, and processes, and document all work in accordance with standards
- Meet and work with business users to define requirements
- Provide hands-on training and mentoring for other data warehouse staff

Requirements for the position:
- 60 months of experience creating and maintaining a data warehouse
- 36 months of experience using Oracle Warehouse Builder 11g
- 36 months of experience writing high-level SQL statements and PL/SQL packages
- 36 months of experience as lead designer/developer of data warehouses
- 24 months of experience using Oracle Business Intelligence Enterprise Edition
- 12 months of experience using Oracle Discoverer / Oracle Discoverer Administrator
- 12 months of experience using Oracle Reports 6.0
- 24 months of experience in staff education and mentoring
- Bachelor's degree in computer science or mathematics

Date first created: 03/11/2018.

Data architect. As a data architect, you will play a major role in the way we store our data, as well as in developing our core back-end infrastructure and building integrations with our platform partners.
- Create and maintain data models across the platform & partners. Own data models from conceptualization to database administration; that means you will dig into business requirements to ensure our models are robust, maintainable, and meet our needs, as well as oversee the systems and interfaces we use to manage the data, analyze the current state, and conceive desired future states.
- Set standards for data management, including supporting the creation of data management processes across all company departments.
- Support integrations between our platform and leading banks and dealerships.
- Support all engineering groups in creating robust,
scalable, consistent applications.
- Share technical solutions and product ideas through design review, pair programming, and technological discussions.
- Work seamlessly in an agile environment with engineers, product managers, business analysts, and designers to understand end-user requirements.
- Formulate use cases and implement pragmatic and effective technical solutions.
- Troubleshoot and debug technical issues.
- Pitch in and support other functions (e.g., product) as needed.

As a data architect you will need:
- 3-5 years of experience in data architecture, preferably in a start-up environment
- A solid software development and DBA background, including database infrastructure deployment
- The ability to conceive of, and portray, the big data picture to all departments in the company
- Experience working in an agile development environment and leading data architecture design for both operational data and data analytics
- Deep knowledge of: SQL databases (e.g., MySQL); NoSQL databases & modern database frameworks (e.g., MongoDB, Cassandra, Redshift); real-time data technologies (e.g., Kinesis, Kafka)
- Bachelor's or graduate degree in computer science or a related field

**Description**

If you are someone who enjoys big data, data analytics, and designing innovative analytical solutions in a fast-paced work environment, as well as working with the latest technologies like PySpark, this job may be for you!
This role as big data engineer is a leader in analytics for Staples and will be responsible for defining, supporting, and evangelizing our Hortonworks implementation across the enterprise. You will work collaboratively, in a consultative way, across the business units, supporting and mentoring their engineering teams and advising on the best approaches to leveraging big data in the most effective and efficient manner. You will partner with Staples digital solutions (technology) teams across the enterprise on a major data analytics platform modernization initiative, building the foundational components to drive the business. You will also be responsible for developing relationships with leaders in both the business and IT, bridging the gap and driving successful solutions across the organization. You will challenge and advise the business and your technology partners on the best use of the data and reporting assets available, while influencing business priorities and driving results. Business partners will depend on your skill set, experience, and creativity to build the best solution possible for their business need. There will be a strong focus on delivering robust, flexible solutions that increase business functionality and support the data science and analytics community.

Responsibilities:
+ Enhance, build, and support our on-premise big data platform (a PySpark sketch follows this list).
+ Engage with business partners across the company to understand their analytical needs regarding big data and data science.
+ Define, evangelize, and support the technology and the business partners who use the platform.
+ Act as a consultant/advisor, translating business requirements into technical solutions and recommending tools and processes to be used to accomplish the business needs.
+ Ensure that the team develops and maintains repeatable, systemic processes that continue to make the team more efficient.
+ Work with business partners to identify areas where we can use technology to make business processes more efficient.
+ Assist in the prioritization of projects to ensure overall business objectives and ROI.
+ Motivate, coach, and serve as a role model and mentor for other development team associates who leverage the platform.
+ Establish standards for access, metadata definition, ELT processes, and downstream data feeds.
+ Assist in the building of a lambda architecture to increase speed and flexibility for downstream analytics.
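PySpark, called out in the posting's opening, is the usual Python entry point to such a platform. A minimal batch-aggregation sketch; the input path and column names are invented for illustration:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sales-by-category").getOrCreate()

# Read a hypothetical CSV extract; on a Hadoop cluster this would be an HDFS path.
sales = spark.read.csv("hdfs:///data/sales.csv", header=True, inferSchema=True)

# Aggregate revenue per category and rank the result.
summary = (
    sales.groupBy("category")
    .agg(F.sum("amount").alias("revenue"), F.count("*").alias("orders"))
    .orderBy(F.desc("revenue"))
)
summary.show()
spark.stop()
```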
analytic tools to answer business questions + ability to work independently with business partners and management to understand their needs and exceed expectations in delivering tools solutions + demonstrated ability to balance the demands of multiple projects and multiple roles depending on size and type of project preferred skills: + master's degree + excellent planning prioritization and problem solving skills + experience with cloud services in aws or azure + experience with hortonworks implementation + experience with predictive analytics + experience with any of the following scripting languages: r json parquet javascript regex powershell shell scripting + experience with databricks hdinsight **title:** data steward-personal lines business intelligence **location:** united states-connecticut-hartford **job number:** 1800644 overview of the position: the hartford's personal lines business intelligence team is seeking a business data analysis consultant to support data quality data governance and data standards practices this consultant is the data steward for personal lines as a data steward the consultant will collaborate with members of various business teams within personal lines and it to manage data as an asset and to ensure trusted quality data the data stewardship team is responsible for process improvement as well as serving in a consultant role to teams within personal lines providing best practices improved processes improved architecture data interpretation and training on governed data sources the primary responsibility of the consultant will be to document the name definition valid values quality security and appropriate use for data this position offers an excellent opportunity to learn the personal lines business while creating a comprehensive knowledge repository for personal lines data assets that is shared across the enterprise the data stewardship team works closely with the data delivery and analytics groups within the larger team to drive governance and consistency in the data tools reporting and analysis published for broader use customers include all areas of personal lines such as actuarial data science operations billing marketing financial product management as well as external areas such as small commercial ais (actuarial information services) and the enterprise data organization the candidate should be able to think critically to tackle complex challenges thrive in a fast-paced environment and be seeking an opportunity to have an immediate impact on day one the consultant should also be a strong communicator eager to learn endlessly curious take pride in hard work and be committed to continual career development responsibilities: + process efficiencies improvement - accountable for driving improvement of all data quality core processes to execute with simplicity speed quality and efficiency + acts as the master owner and authority for all information for assigned application database and strategic projects + takes accountability and responsibility for maintaining the business dictionary and data quality at the application database level + certifies information at the line of business (lob) level to be accurate and consumable + establish and build effective relationships with business customers and external contacts to deliver on accountabilities + develop an understanding and knowledge of key business functions + responsible for building maintaining and managing the business dictionary along with appropriate valid values within the enterprise metadata repository (datawiki) + integrate all
application database technical metadata into the emr for available capabilities + assure all datawiki technical metadata is connected to the business context along with data lineage + partner with business and the enterprise data standards organization to ensure that data and metadata captured meets the organization's standards and requirements + provide training or education to personal lines employees on governed data sources and metadata + implement roles responsibilities procedures processes and guidelines for respective application databases in compliance with enterprise business data standards + ensure business information is properly utilized in the operational environment + proactively suggest improvements to operational process or the enterprise data governance & stewardship program + actively participates in the enterprise data office's activities to improve the program achieve enterprise certification for critical data elements and resolve duplicate terms and definitions in the emr + proactively enhances business semantics maintaining metadata and ensuring data quality + test and profile data to ensure proper operation and accuracy qualifications: experience & skills + understanding of personal lines business and products is a plus + knowledge of and access to front end data collection including café qti 1view etc is a plus + ability to work effectively in a team environment and independently + excellent communication skills to interact effectively with both technical and non-technical users at all levels of the organization + excellent analytical interpersonal and organization skills + self-motivated strong sense of ownership accountability and results oriented with the ability to manage time and schedules effectively + ability and willingness to work in an active production environment + must be able to work on multiple concurrent projects + positive thinking: energetic and enthusiastic with a can do attitude + deals with issues in a proactive and reactive manner + accepts challenges and change + willingness to learn new technologies methodologies and apply them + ability to understand and follow guidelines and standards + proven negotiation abilities + ability to drive a decision to fruition + customer focus: strives to give customers the best service takes the initiative to add value + technical tools: toad sql enterprise metadata repository microsoft excel + understanding of im concepts is a plus + degrees represented on current team include: business administration economics computer science finance international business and mathematics + experience represented on current team includes: business intelligence competitive intelligence data management it research actuarial pricing product management marketing sales & distribution direct to consumer and finance behaviors at the hartford: + deliver outcomes – demonstrate a bias for speed and execution that serves our shareholders and customers + operate as a team player – work together to drive solutions for the good of the hartford + build strong partnerships – demonstrate integrity and build trust with others + strive for excellence – motivate yourself and others to achieve high standards and continuously improve equal opportunity employer females minorities veterans disability sexual orientation gender identity or expression religion age ** no agencies please ** job: business data analysis primary location: united states-connecticut-hartford other locations: schedule: full-time job
level: individual contributor education level: bachelor's degree (±16 years) job type: standard shift: day job employee status: regular overtime status: exempt travel: yes 10% of the time posting: feb 23 2018 2:01:18 pm remote worker option: no the hartford is an equal employment and affirmative action employer all qualified applicants will receive consideration without regard to race color sex religion age national origin disability veteran status sexual orientation gender identity or expression marital status ancestry or citizenship status genetic information pregnancy status or any other characteristic protected by law the hartford maintains a drug-free workplace and is committed to building inclusion and leveraging diversity our leading pharmaceutical client located in the whippany area is looking for a senior lead data manager to join their team department team description: oncology business unit – medical & data management data management serves as key subject matter expert on topics related to data management activities contributions include but are not limited to: vendor selection and management mentoring junior staff training team members ongoing data review and reconciliation project data management position summary: the senior lead data manager (sr ldm) leads and or supports the execution of data management activities necessary for the preparation of submission data and required documentation for regulatory authorities and the greater clinical research community the sr ldm may assume the business role of a study data manager in one or more clinical studies in this capacity this staff member serves as the primary contact for oncology data management on the core study team position duties & responsibilities - the sr lead dm assumes operational and or oversight responsibility as study data manager for all assigned internal and outsourced studies and applies data management best practices - incorporates and maintains company standards in clinical studies and projects for all elements of the medical standards package - documents all activities adequately for all assigned studies according to sops and takes a lead role in qc activities which includes but is not limited to: initiating the sdmd maintaining document management systems coordinating and ensuring contributions from relevant functions (i e edc cdc etc ) requesting a timely qc of the sdmd informing relevant functions of results and ensuring proper communication between functions qc manager and self so that all issues are reconciled - specifies and develops study-specific ecrfs database structures and data consistency checks based on medical standards the clinical study protocol and input from the study team - prepares tracks and implements standard plans (i e gdm study plan data management plan operational oversight plan etc ) to ensure proper governance of data management study set-up conduct and closure activities - accountable for data management activities necessary for the establishment of subject validity and analysis set assignment including but not limited to the following: specification of protocol deviations planning and conduct of interim and final validity review meetings - supports study data management and data cleaning processes on an ongoing basis applying study-specific documents and conventions - identifies and issues queries incorporates query replies tracks query status requirements & preferences education requirement(s): - bachelor's degree (or equivalent) in natural sciences informatics or medical documentation - at least 5 years of study and or
project level experience as a data manager in supportive and leading roles - at least 2 years of experience should demonstrate responsibility as a study leader - good understanding of the drug development process - strong organizational skills and able to collaborate with minimal supervision preferences: - basic sas programming knowledge or other database experience - significant experience in using data management methodologies and technologies (e g data warehousing electronic data capture) - demonstrated understanding of regulations and guidelines (e g ich gcp european clinical trials directive privacy rules [hipaa] etc ) kforce has a client that is seeking a big data analytics architect in pekin illinois (il) overview: the big data analytics architect will be accountable for the full lifecycle of analytics products providing our customers with strategic tactical and operational insights the candidate will be responsible for end to end solutions including strategy development design patterns product selection technical implementation and configuration of analytics solutions you will work with stakeholders and strategic partners to identify data warehousing and analytics requirements and opportunities as a big data analytics architect you will be a leader on our analytics team by selecting and implementing the full stack of data warehousing and business intelligence technology job responsibilities: * steward of presentation and data consumption standards * leads strategy and standards for analytical and data science tool set * leads the development of design patterns * participate with enterprise data architect on defining enterprise data model * act as liaison to the business analytics team * participate in developing and ensuring compliance with architecture principles and standards for the various systems and components based on design patterns * provide architecture input to the architecture risk management process * provides feedback on solution architecture * ensures success of metadata management strategy * leads product selection process for analytics tools and processes * other duties as requested or assigned * typically this position will require an accredited four-year information technology degree a closely related technical degree or equivalent experience * a certification in architecture or a related relevant field preferred * progressive related experience of 8-12 years is preferred * experience in multiple business processes or organizations and experience in multiple it disciplines * good technical communication and influencing skills are required * common skills: leadership teamwork interpersonal and influencing skills * business skills & methods: business case development business process mapping strategic planning * enterprise architecture skills: architecture modeling building block design applications and role design and systems integration * program or project management skills: managing business change project management methods and tools etc * technical it skills: thorough understanding of modern data warehouse architecture and business intelligence; experience with visualization and analysis tools such as tableau qlikview powerbi and excel; experience with aws data stack a plus but not required * if needed a relocation package will be provided this person will need to be on-site the majority of the time kforce is an equal opportunity affirmative action employer all qualified applicants will receive
consideration for employment without regard to race color religion sex pregnancy sexual orientation gender identity national origin age protected veteran status or disability status *compensation type:* hours *minimum compensation:* 0 00 *maximum compensation:* 0 00 company description: technoconsulting is a us based software consulting firm with referenceable experience among small to large scale organizations around the globe in addition to all sorts of application development and it support & maintenance we offer our clients flexible and cost-effective staffing solutions that allow them to maximize the power of technology for the ongoing success of their business combining unparalleled experience and comprehensive capabilities across all industries and business functions technoconsulting collaborates with clients to help them become high-performance technology enterprises job description: data architect analyst (f2f interview required) new york ny 6+ months to hire job description: responsible for gathering and assessing business information needs and preparing requirements for reporting products performs various ad-hoc data analyses as required by the customers uses data mining and data analysis tools reviews and validates data loaded into the data aggregates for accuracy interacts with user community to produce reporting requirements provides technical consulting to users of the various data repositories and advises users on conflicts and inappropriate data usage responsible for prototyping and developing reporting solutions preparing test scripts and conducting tests maintains knowledge of software tools languages scripts and shells that effectively support the data environment on premise and on the cloud possesses working knowledge of relational database management systems (rdbms) data warehouse concepts and tools performs professional level work develops solutions requiring analysis and research responsible for critical work and or complex projects performed within a broader technical and business context required skills: typically possesses 5 years or more of relevant work experience · very strong sql coding skills · experience with procedural sql languages such as t-sql or pl sql · experience with tableau or similar reporting tools · experience with data analysis languages such as r and sas · experienced with and open to learning new cloud based technologies such as aws redshift and apache spark · generate analyses and provide concise insightful summaries from business questions · use the most efficient business intelligence or data management tools to provision data analyze data identify and document gaps and enable self-service · monitor and analyze trends with particular emphasis on those impacted by new product releases operational changes and new data collection techniques · support analyses using existing and new data sources of both internal and external origin and position them to be incorporated into enterprise platforms · identify existing reports and or processes for automation opportunities · understand the needs of internal and external information stakeholders and identify opportunities to meet those needs as the college board's data capabilities and systems advance · execute plans to perform user acceptance testing and quality control essential to the deployment of reports and business information solutions · any other related activities as assigned preferred skills: · ability to deliver quality products in a fast paced team-oriented unit · strong knowledge of relational database
systems data structures and data modeling concepts · experience in data management data quality and data automation · experience in project documentation and the design of process and data flow diagrams · knowledge of business intelligence and visualization tools such as cognos business objects or tableau preferred · strong analytical skills with demonstrated experience providing information reports and analyses from unstructured and vague business questions and problems · knowledge of system development life cycle · strong organizational skills and the ability to prioritize projects and information · excellent oral and written communication skills willing to communicate with business users to clarify requirements education certifications: bachelor's degree in mathematics statistics or computer science preferred or deep experience in data management qualifications: additional information: please send your resume and contact details to mathew(at)technoconsulting(dot)com or 732-200-2488 bex realty is a technology enabled real estate company based in south florida with seven offices spread out across the state by employing smart web design custom coding and innovative search engine optimization (seo) bex realty surpasses larger technology solutions by generating more and higher quality leads with little to no paid-for traffic or advertisement building on 13 years of expertise in fostering relationships white glove services and ensuring that our customers have the best experience possible bex realty is launching bex technology services to transform these best practices and lessons learned into a next generation nation-wide platform for everything related to purchasing and owning a home to launch this effort bex technology is building out our engineering teams we're looking for experts who are interested in building the next generation technology from scratch leaders who are still highly technical and excellent individual contributors as well as successful team leaders this first phase of hiring for bex technology is focusing on our foundational members - building a team that consists of experts across devops (ci cd public cloud) data engineering services oriented architecture full stack web development and operations support if you are ready for the rewarding and challenging opportunity of building solutions from scratch ready to get your hands dirty and ready to help grow excellent engineering teams then continue reading below and apply today! bex technology is seeking an exceptional senior lead data engineer to design and implement the data aggregation solutions for the bex platform the bex technology platform needs to consume multiple external data feeds internal data sources and provide a standardized data set for multiple client applications the ideal candidate is passionate about data analytics and cutting edge scalable technology having implemented data solutions from scratch before the ideal candidate should have experience in taking legacy first-generation systems and migrating them to modern technologies responsibilities for this role: create maintain and support the etl pipelines; ensure data integrity across the systems; review maintain and progress current data designs; design implement and maintain data warehouse solutions; develop and maintain scalable automated user-friendly systems that will support our analytical and business needs; build strong relationships with system stakeholders and advocate for data oriented solutions requirements: excellent communication and documentation
skills; expertise in advanced relational database design (mysql postgres or similar technologies); demonstrated ability in etl development survey platforms and data warehousing or similar skills; object-oriented programming; strong problem solving skills; adaptable proactive and willing to take ownership; bachelors in computer science engineering mathematics or related field preferred: prior experience in building platforms; web experience (php html etc a bonus); 5+ years of experience in a data engineer or bie role with a technology company; expert in writing and tuning sql scripts; experience with reporting tools like tableau or similar bi tools benefits: health care plan (medical dental); retirement plan (401k with matching); paid time off (vacation sick & public holidays); training & development; free coffee snacks & refreshments **principal software developer data analytics** **description** join the data analytics team where you will provide software development algorithm development and data analytics (to include big data analytics data mining and data science) to enable data-driven decisions and insights experience with analytic techniques and methods (e g supervised and unsupervised machine learning link analysis and text mining) as well as software languages is a must knowledge of data analytics in the healthcare industry software languages and big data technologies needed include: java python r c# c sas analytic engines hadoop parallelized analytic algorithms and nosql and massively parallel processing databases the successful candidate will have the ability to formulate problems prototype solutions and to analyze results key functions: * formulate data analytic problems * get and cleanse data * employ analytic methods and techniques * develop analytic algorithms * analyze data * predictive analytics * analyze cms health data * analyze medicare fraud data **qualifications** required qualifications: hands-on software development skills (java r c c# python javascript) with analytic applications and technologies getting and cleansing data data storage and retrieval (relational and nosql) data analytics and visualization and cloud-based technologies experience with the map reduce programming model and technologies such as hadoop hive and pig is a plus experience analyzing medicare fraud data and cms health data is needed for some projects preferred qualifications: minimum qualifications: bs and 10 years of related experience including 2 years of leadership experience **job** sw eng comp sci & mathematics **primary location** united states-massachusetts-bedford **this requisition requires a clearance of** top secret sci **travel** yes 10% of the time **job posting** feb 21 2018 3:04:35 pm **req id:** 00049468 we make life better every day the clorox company (nyse: clx) is a leading multinational manufacturer and marketer of consumer and professional products with about 7 900 employees worldwide and fiscal year 2017 sales of $6.0 billion clorox markets some of the most trusted and recognized consumer brand names including its namesake bleach and cleaning products; pine-sol® cleaners; liquid plumr® clog removers; poett® home care products; fresh step® cat litter; glad® bags wraps and containers; kingsford® charcoal; hidden valley® dressings and sauces; brita® water-filtration products; and burt's bees® natural personal care products and renewlife® digestive health products the company also markets brands for professional services including clorox healthcare® and clorox commercial solutions® more than 80 percent of
the company's sales are generated from brands that hold the no 1 or no 2 market share positions in their categories as a member of our information technology team you'll develop flexible technology solutions and services that support the ever-changing needs of the company information technology is committed to developing team members' technology expertise and leadership skills individuals in these key positions apply their technical and business process expertise toward building an integrated enterprise data foundation additional opportunities exist in areas such as operations systems engineering and production support key responsibilities (in descending order of importance): * as a data engineer you would work with our data analytics and ai technology advancement center to make things easier faster and more reliable for thousands of employees and customers of clorox to make data driven decisions * your demonstrated mastery of data analysis techniques and data tools such as sql python c# java nosql etc will help create the infrastructure that powers our analytical insight and data science efforts at clorox you will work alongside the senior data architect to ensure that the data analytics and data science mission is being achieved through tool setup process creation training and advocacy * if you love data data warehousing big data and more importantly empowering the use of data this could be the role for you you will be expected to interact with customers on a regular basis take individual responsibility for features (from customer contact to final delivery) work with more senior members on larger efforts and take part in continuously improving our product and company minimum requirements: years and type of experience: * bs in computer science or engineering and 10+ years of overall relevant experience in application programming and integration * 5+ years of experience in data ingestion and storage systems for a big data environment using at least one of the cots integration tools like - snaplogic webmethods tibco talend informatica and or custom scripting in python java * 2+ years of must-have experience using apache beam google dataflow or apache spark in creating end-to-end data pipelines * 2+ years of data engineering experience with big data environments and writing map-reduce jobs using java scala or python * 5+ years of webservice and api integration experience using rest api json node js and python * experience with connecting and integrating with at least one of the platforms - google cloud amazon aws microsoft azure - and or various data providers like facebook or twitter api integration * experience in writing hive sql and ansi sql based complex queries and data aggregation and transformation * experience with continuous build deployment and team development environments like - jenkins chef puppet jira etc * experience troubleshooting and taking responsibility for small features from design to user delivery * enthusiasm for the field and professional development improvement outside the day to day job skills and abilities: communications: above average communication and presentation skills strong written and oral skills ability to listen to requirements and translate them into technology solutions education level degree: bs in computer science or engineering equal opportunity employer minorities women protected veterans disabled
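for readers unfamiliar with the end-to-end pipeline work the clorox posting describes, a minimal pyspark sketch of an ingest-aggregate-write flow might look like the following - the file name and column names (pos_sales.csv, store_id, sold_at, amount, order_id) are invented for illustration and are not from the posting:

```python
# minimal sketch of an end-to-end aggregation pipeline in pyspark
# (hypothetical file and column names; not the employer's actual data)
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sales_rollup").getOrCreate()

# ingest: read a raw extract into a dataframe
sales = spark.read.csv("pos_sales.csv", header=True, inferSchema=True)

# transform: aggregate revenue and order counts by store and day
daily = (
    sales
    .withColumn("day", F.to_date("sold_at"))
    .groupBy("store_id", "day")
    .agg(F.sum("amount").alias("revenue"),
         F.countDistinct("order_id").alias("orders"))
)

# load: write partitioned parquet for downstream hive / ansi sql consumers
daily.write.mode("overwrite").partitionBy("day").parquet("out/daily_sales")

spark.stop()
```

writing the result as partitioned parquet is one common way to hand aggregates to the hive sql and ansi sql consumers the posting mentions; a production pipeline would add schema enforcement and scheduling on top of this shape.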
this role will be responsible for collaborating with the digital data analytics team to help discover insight within large amounts of data that lie across multiple platforms the digital data strategist will translate data analysis into strategies that align with actionable goals along with having hands-on experience analyzing large-scale datasets sifting through data and creating engaging stories around market intelligence competitor benchmarking internal capability gap assessment & instrumentation and other strategic initiatives responsibilities: + consult with internal stakeholders to identify business problems and map them back to analytics objectives + collaborate with the analytics team to package deliverables in articulate and business-intuitive presentations + provide analytics and data science thought leadership for digital + select features and build and optimize classifiers using machine learning techniques + data mining using state of the art methods and best practices + processing cleansing and verifying the integrity of data used for analysis + perform ad-hoc analysis and present results in a clear manner + create automated anomaly detection systems and constant monitoring of performance basic qualifications: + *minimum of 3 years of related experience with large-scale data manipulation analytic tools and data visualization + *minimum of 3 years of related experience producing both tactical and strategic analytic products and briefing senior level managers + *minimum of 3 years of experience using scripting languages for data processing + *certification as a scrum product owner (cspo) is preferred skills and knowledge: + must have a mix of technical ability leadership and presentation skills + experience using tableau to handle large data sets + experience with agile scrum delivery + highly proficient in ms word visio excel and tools such as jira confluence + excellent communication skills both written and verbal + exhibit strong analytical project management and storytelling skills + strong understanding of machine learning techniques and algorithms + experience with data visualization tools such as d3 js ggplot etc + proficiency in using query languages such as sql hive pig etc + experience with nosql databases such as mongodb cassandra hbase redis + experience with common data science toolkits such as r weka numpy matlab etc excellence in at least one of these is highly desirable + experience with statistical software such as r python or other deep learning languages + strong analytical skills with the ability to extract the context of the data through standard data analysis methodologies + good applied statistics skills such as distributions statistical testing regression etc + good scripting and programming skills using javascript and .net languages + demonstrated ability to deal with ambiguity and rapidly shifting priorities + experience leveraging cloud platforms (aws azure) is preferred *represents basic qualifications for the position; to be considered for this position you must at least meet the basic qualifications city national bank is an equal opportunity affirmative action employer minorities females individuals with disabilities veterans **note** : this preceding job description has been designed to indicate the general nature and level of work performed by employees within this classification it is not designed to contain or be interpreted as a comprehensive inventory of all duties responsibilities and qualifications required of employees assigned to this job **note** : candidates should be advised that city national bank does not pay interviewee travel expenses or relocation expenses for candidates who are hired unless previously agreed *li-mb1 equal opportunity employer minorities women protected veterans disabled
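the "automated anomaly detection systems" responsibility in the posting above can be illustrated with a toy z-score rule - the metric series, function name and threshold here are fabricated for the sketch and are not the bank's method:

```python
# toy anomaly detector: flag points far from the mean of a daily metric
# (illustrative z-score rule only; real systems use more robust statistics)
import numpy as np

def find_anomalies(values, threshold=3.0):
    values = np.asarray(values, dtype=float)
    mean, std = values.mean(), values.std()
    if std == 0:
        return []  # constant series: nothing to flag
    z = np.abs(values - mean) / std
    return [i for i, score in enumerate(z) if score > threshold]

daily_visits = [1020, 998, 1011, 1005, 4870, 1002, 995]  # fabricated demo data
# a single large spike inflates the mean and std, so the demo lowers the cutoff
print(find_anomalies(daily_visits, threshold=2.0))  # -> [4]
```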
marketing company description: advanced onion is seeking a new member to join our team of qualified and diverse individuals the qualified applicant will become part of advanced onion's team supporting the federal government's mission to provide solutions for the department of defense the products delivered by this team actively help to ensure the safety of service members and their families around the world we are actively searching for qualified candidates for this potential opportunity we are currently identifying candidates for this future effort this position is contingent upon award of task order to advanced onion job description: architect develop and maintain exposed data service applications (either publishing and subscribing sides) designed to use the latest web- and mobile-device technologies; participate in the full software development lifecycle (ecosystem) of mobile device applications; collect and shape requirements for a mobile device-based ecosystem designed to support service oriented architecture (soa) and web 2.0 for analytical and mission needs; develop and maintain applications that expose and consume web services using either native (api) or web-based mobile clients; experience with ios and or android operating systems; experience with html 5; experience with simple object access protocol (soap) web services; experience with wi-fi cdma gsm 3g 4g lte or near field communications (nfc) technologies; experience developing native (api) or web-based applications; experience with offline data-centric architectures that provide for disconnected intermittent and or limited (dil) communications this position description is not intended as nor should it be construed as exhaustive of all responsibilities skills efforts or working conditions associated with this job this and all positions are eligible for organization-wide transfer management reserves the right to assign or reassign duties and responsibilities at any time qualifications educational requirements: bachelor's degree from an accredited college or university in computer science mathematics or engineering or a mathematics-intensive discipline fine arts or graphic design or an applicable training certificate from an accredited institution experience requirements: 2 to 5 years of intensive and progressive experience in a related field including design development and maintenance of web-based applications clearance requirements: an active position of trust clearance (or higher: secret or ts) additional information benefits preview: https://www.zenefits.com/benefits-preview/?token=7856ebb2-c135-493e-bc01-9fb02d0c251e disclaimers: a competitive salary for each individual will be commensurate with experience and education as it relates to the position requirements unless specifically stated otherwise each position is onsite at the specified location due to regulatory security criteria all candidates must have u s citizenship h1b visa holders h1b sponsorships and u s resident green card holders will not be considered check individual job opportunities to see if a security clearance is required applicants under final consideration for hire will be subject to a thorough background check and security clearance checks advanced onion is an equal opportunity employer your information will be kept confidential according to eeo guidelines
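as a loose sketch of the "expose and consume web services" duty described above - the base url, endpoint path and function name are hypothetical, and the posting's actual services are soap and mobile oriented rather than this plain rest example:

```python
# tiny sketch of consuming a json web service over http
# (hypothetical endpoint; a soap client such as zeep would differ in detail)
import requests

def fetch_unit_roster(base_url, unit_id):
    """fetch a roster document for one unit from a hypothetical rest service."""
    resp = requests.get(f"{base_url}/units/{unit_id}/roster", timeout=10)
    resp.raise_for_status()  # surface http errors instead of parsing bad bodies
    return resp.json()

# usage (requires a real service behind the url):
# roster = fetch_unit_roster("https://api.example.com", "42a")
```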
summary: a data steward is the role within christian care ministry responsible for utilizing our data governance processes to ensure fitness of data elements - both the content and the metadata data stewards have a specialist role that incorporates processes policies guidelines and responsibilities for administering our data in compliance with policy and or regulatory obligations the overall objective of a data steward is data quality with regard to the key critical data elements existing within our operating structure and the elements in their respective domains this includes capturing and documenting (meta)information for the elements identifying owners and custodians providing insight pertaining to attribute quality aiding with project requirement data facilitation and documentation of capture rules christian care ministry is a community of christians that requires its employees to share its christian religious beliefs and practices ccm complies with all anti-discrimination laws applicable to religious employers a career with christian care ministry (ccm) is more than just a job it's a ministry ccm's mission is to connect and equip christians to share their lives faith talents and resources with others the foundation of ccm's mission is acts 2:42-47 essential duties and responsibilities: a data steward ensures that each assigned data element: has a clear and unambiguous data element definition; does not conflict with other data elements in the metadata registry (removes duplicates overlap etc); has clear enumerated value definitions if appropriate; is still being used (remove unused data elements); is being used consistently in various computer systems; is being used fit for purpose = data fitness; has adequate documentation on appropriate usage and notes; documents the origin and sources of authority on each metadata element; is protected against unauthorized access or change; and ensures the accuracy of new data entering the data ecosystem essential skills: you are comfortable with metadata management data dictionary and similar productivity tools; you are very familiar with the healthcare industry; good written and verbal communication skills; good customer service skills; good written documentation skills; comfortable with technology and willing to learn new technical applications and tools; research and learning you may be the right fit if: you are a self-starter who will figure out what it takes to get the job done; you love to learn and learn fast; you are timely and reliable; enjoy working with and in teams and are a team player; you possess a proven ability to execute on developed strategies; you understand actively participate and are passionate about continuous improvement; you care about the details and getting things exactly right (gray areas drive you crazy!); you are organized!; you have a great attitude education experience: ba bsc in computer science or data management; 5 years experience in data management perks: competitive pay; 401(k) and great benefits; an on-site gym and cafeteria; free gourmet coffee stations ccm values our employees each individual is given the opportunity to excel in their strengths advance their skill set and have direct input into the decisions that ultimately drive the ministry's success this job description in no way states or implies that these are the only duties to be performed by the employee(s) incumbent in this position employee(s) will be required to follow any other job-related instructions and to perform any other job-related duties requested by any person authorized to give instructions or assignments this document does not create an employment contract implied or otherwise other than an "at will" relationship
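the data-element checklist in the posting above maps naturally onto executable checks; a minimal sketch assuming an invented dictionary-based registry (not ccm's actual tooling or field names) follows:

```python
# sketch of the data-element checklist above as executable checks
# (field names and sample entries are invented for illustration)
def validate_element(element, registry):
    issues = []
    if not element.get("definition"):
        issues.append("missing or ambiguous definition")
    dupes = [e for e in registry
             if e is not element and e.get("name") == element.get("name")]
    if dupes:
        issues.append("name conflicts with another registry element")
    if element.get("type") == "enumerated" and not element.get("valid_values"):
        issues.append("enumerated element lacks value definitions")
    if not element.get("in_use", True):
        issues.append("unused element: candidate for removal")
    return issues

registry = [
    {"name": "member_status", "definition": "current standing of a member",
     "type": "enumerated", "valid_values": ["active", "lapsed"]},
    {"name": "member_status", "definition": "", "type": "string"},
]
for e in registry:
    print(e["name"], validate_element(e, registry))  # both flag the name clash
```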
expedia ## expedia ecommerce platform - ecp the ecp organization was built to enable the business to scale globally while supporting the needs of our rapidly growing family of brands the ecp team powers expedia inc 's travel revolution with a world-class platform of ecommerce technologies and services operating at global scale we bring product technology and operations together to accelerate innovation and enable test and learn constantly pushing ourselves to evolve and innovate for the business #### data services and platforms - dsp our mission in dsp is to power reliable trustworthy and scalable data products fueling decision-making and competitive advantage for expedia inc we use a blend of open source custom development and commercial tools to deliver and operate petabyte scale data platforms used across the business our ecosystem includes multiple cloud providers massive hadoop clusters parallel databases and a broad mix of leading data management and analytics tools #### position summary are you passionate about crafting world-class products and platforms that form the data ecosystem that powers billions of decisions every single day? the technical product manager is a key member of the edw product organization responsible for gathering requirements and translating them into robust user stories aligned to business value along with driving the development lifecycle and collaborating with engineering teams you will work closely with product and engineering leadership along with a wide array of partners to define priorities and work on a daily basis in an agile environment to build and implement outstanding products you are technically minded and have a good grasp of how data can improve an organization you have worked in a data centric environment and can demonstrate a good understanding of how the right combination of products can simplify the challenge of extracting value from data you can effectively move from a conversation with a data scientist about the challenges of code deployment to writing a detailed technical document that describes a complex data management challenge you have the drive and enthusiasm to join a technical team on a journey towards greatness working together to overcome any challenge that presents itself and collaborating with multiple teams to get to the optimal outcome #### responsibilities * capture requirements from partners across the organization including technical detail and business goals * bring these requirements to life in the form of use cases and user stories that enable engineers to solve the business challenges presented * communicate effectively with both partners and engineering teams to support ongoing partnerships and to understand how the products are used in order to spot further opportunities * ensure functional specifications match business goals and support the realization of those goals through user testing product assessments and customer training documentation * prioritise engineering backlogs and align delivery to overall team priorities * collaborate with development managers scrum masters product managers as well as other teams around the enterprise * contribute to design discussions and challenge technology assumptions where required * maintain technical and business facing
documentation #### qualifications * excellent knowledge of data platforms technologies and the inherent challenges these technologies face for an analyst or data engineer along with a solid understanding of the product lifecycle * experience in a data analytics-centric environment across technologies such as hadoop kafka teradata highly preferred along with some knowledge of cloud infrastructures (aws gcp azure) as an added bonus * great organizational skills and a high level of oral and written communication skills * ability to creatively solve challenging business technology problems * experience gathering business requirements and turning them into detailed functional design specifications using agile methods * ability to prioritise own workload across multiple projects simultaneously and be comfortable working in a dynamic environment * ability to work with cross functional teams to deliver the implementation of new features as well as resolve operational issues * familiarity with web-based and or data oriented product development including tiered architecture data organization and performance optimization expedia is committed to creating an inclusive work environment with a diverse workforce all qualified applicants will receive consideration for employment without regard to race color religion gender gender identity or expression sexual orientation national origin genetics disability age or veteran status this employer participates in e-verify the employer will provide the social security administration (ssa) and if necessary the department of homeland security (dhs) with information from each new employee's i-9 to confirm work authorization * posted 30+ days ago * full time * r-22182 build the best: would you enjoy working on one of the biggest and fastest ad-engines in the marketplace? we are already serving audiences in the hundreds of millions and processing tens of billions of requests per month and as the demand for our services is expected to multiply you will be challenged with: scaling accordingly developing new components and seamless implementations allowing for zero downtime (you code with the confidence speed and steadiness of a world class surgeon!) you will be responsible for the lifeblood of the yume platform and have the opportunity to be a superhero in the world of high-volume real-time open-source systems job profile & responsibilities: design enhance and implement scalable reliable and maintainable technologies for our backend data and etl platform; distill business requirements into design specifications; enforce code quality through test driven development; ensure that our product is carrier grade in terms of reliability scalability and performance; lead efforts in capacity planning; work closely with the testing team to design intelligent testing strategies; conceptualize software requirements based on both external and internal inputs; apply and tailor best practices in software processes and quality to achieve fast cycle time development; work closely with off shore development teams required qualifications: bs ms in computer science or a related field; 3+ years of relevant experience developing software for a high transaction volume high availability environment is mandatory; prior work experience in online advertising or media industry is mandatory; strong
knowledge of data structure modeling replication & distributed data; object relational database mapping; coding and system design using hadoop hive flume java j2ee xml soap web services; in depth expertise in linux apache tomcat hadoop technologies is mandatory; experience with hadoop and messaging frameworks like activemq and other scalable technologies used in on-line advertising is required; ability to solve complex problems with simple solutions; strong working knowledge of distributed systems technology; web based technologies application servers application and database performance tuning approaches and tools; strong empathy for users and customers; strong technical documentation & presentation skills yume is a leading independent provider of digital video brand advertising solutions its proprietary data-science driven technologies and large audience footprint drive inventory monetization and enable advertisers to reach targeted brand receptive audiences across a wide range of internet-connected devices designed to serve the specific needs of brand advertising yume's technology platform simplifies the complexities associated with delivering effective digital video advertising campaigns in today's highly-fragmented market yume (nyse: yume) is headquartered in redwood city ca with european headquarters in london and eight additional offices worldwide this is a rare opportunity to be part of an organization that is driving the shift of tv brand advertising to digital ***principals only*** yume is currently not accepting resumes applications from third party recruiting agencies any unsolicited resumes applications will be deemed a gift and no fee shall be due to the agency as a senior data engineer you're responsible for proposing designing and implementing technical solutions to improve our clients' data-driven insight capabilities these solutions encompass all aspects of development for data analytics solutions from front end data visualization to backend data management including modern data architecture data integration and master data management (mdm) the ideal candidate will also have experience with tools and techniques surrounding etl mdm business intelligence data visualization analytics and big data you are expected to be involved in the full lifecycle of delivering projects from creation and review of proposals with the sales team to accountability for delivery of customized solutions that deliver long-term value to our clients you are ideally a thought leader and have a perspective as to the evolution and future of information management responsibilities: serve as lead in delivery and oversight; define solutions plan information management projects and provide extensive technical and strategic guidance to project team members; ensure best practices and standards are adhered to across data warehousing and business intelligence projects; conduct and lead white-boarding sessions workshops design sessions and project meetings playing a key client-facing role for the project; partner closely with business users to identify and analyze requirements; understand clients' operations and business processes to design and recommend end-to-end information management solutions including database design structures data integration and reporting frameworks; evaluate potential technology tool solutions that meet business needs and facilitate discussions toward desirable outcomes; create functional and technical documentation related to data architecture and business intelligence solutions; effectively diagnose isolate and
resolve complex problems pertaining to data infrastructure including performance tuning and optimization; define solution roadmaps to achieve end-state solutions including capability prioritization and cost estimates; review project communications and deliverables to ensure a consistent high-quality client project experience; engage with client senior leadership to build rapport address strategic concerns and translate complex technical terminology and concepts to an appropriate level (e g explain to non-technical staff); follow technology and industry trends and educate sales and solutions teams on new developments that will impact existing or potential clients; work with pre-sales teams to develop proposals and pricing models for complex and innovative customer solutions; review and assess technical proposals and contracts validating feasibility of technical approach design and rationalizing effort and resource estimates we are actively looking for professionals with experience in the following enterprise information management disciplines: data warehousing; data warehouse design and architecture; data mart design and architecture; dimensional modeling; data integration data services; analytics and reporting; performance management; balanced scorecards & dashboards; ad hoc and guided analytics olap; predictive analytics; data visualization; data governance; design and initiation of data governance organization; design and initiation of data governance process; metadata management; master data management; data quality; data auditing and security preferred skills: degree in computer science information technology or related; 5+ years of progressively responsible post-baccalaureate experience in business intelligence or data management architecting and implementing enterprise data and analytic solutions; 3+ years of experience leading data warehousing business intelligence implementation teams managing project scope providing technical leadership and mentoring junior resources; 3+ years of experience with database design from conceptualization to database optimization using common cloud and on-prem platforms such as aws redshift azure sql data warehouse bigquery sql server teradata oracle etc; deep experience or knowledge in foundational information management arenas such as master data management data governance modern data architecture and data integration; experience in data architecture modeling etl data quality and mdm tools including ibm infosphere and microsoft sql server integration services data quality services and master data services; experience in architecting and implementing emerging technologies tools such as aws hadoop cloudera and or hortonworks to address predictive analytics and unstructured data use cases; experience in designing and implementing next generation architecture solutions incorporating big data nosql cloud-based analytics and real-time analytics; experience in working with large volumes of data from disparate data sources across complex business processes and functions; must have strong leadership and interpersonal skills to resolve problems in a professional manner lead working groups negotiate and create consensus; must be able to astutely operate in and navigate through client organizations; must have strong written oral communication and presentation skills; must be highly self-motivated entrepreneurial humble and curious location: work remotely from home your favorite coffee shop or hatchworks travel will be required occasionally to meet with clients located within the continental united states travel costs are reimbursable if you think the hatchworks experience could be a great fit for you please click "apply" located below to submit your resume for consideration to learn more about our organization visit us at www hatchworks com
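for the dimensional modeling discipline named in the posting above, a bare-bones star schema - one fact table keyed to two dimension tables, with all table and column names invented for the sketch - can be illustrated with sqlite:

```python
# bare-bones star schema of the kind dimensional modeling work implies
# (invented table/column names; sqlite used only so the sketch is runnable)
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (
    date_key   INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20180223
    full_date  TEXT NOT NULL,
    fiscal_qtr TEXT NOT NULL
);
CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY,
    sku         TEXT NOT NULL,
    category    TEXT NOT NULL
);
CREATE TABLE fact_sales (
    date_key    INTEGER NOT NULL REFERENCES dim_date(date_key),
    product_key INTEGER NOT NULL REFERENCES dim_product(product_key),
    units       INTEGER NOT NULL,
    revenue     REAL NOT NULL
);
""")
conn.close()
```

keeping measures in the fact table and descriptive attributes in the dimensions is the design choice that makes the ad hoc and olap-style analysis the posting lists straightforward to query.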
as a manager within the informatics team you will contribute to data warehouse design etl design and development and lead developers and support teams for the innovative analyses provided by xcellerate® using enterprise etl architectures and innovative algorithms we seek smart focused passionate self-starters who bring energy new ideas and practical experience to a fast-paced and dynamic team are obsessed with delivering useful and elegant solutions and care deeply about their customers and each other education: minimum required: bs in life sciences computer science information systems or related discipline ms preferred responsibilities: design develop and review data architectures information flows and etl code that support diverse data analyses in clinical data services; manage the etl and bi development and delivery for the project portfolio of data warehouse and analysis needs across clinical data services; manage time allocation of the development team and budget allocations across the project portfolio; develop and maintain partnership with matrix project teams; lead the design development and testing efforts of the development teams; provide mentoring and supervision to development and support teams; collaborate with diverse product team stakeholders to prioritize support issues by the onsite and offshore support team; leverage your technical expertise in data modeling and business intelligence to assure development of industry-leading solutions; minimum 7-10 years of data warehousing and data integration experience in complex business environments with 2-5 years of proven etl design experience; strong experience in logical and physical data modeling in sql server oracle or similar rdbms systems; design and development experience with informatica bods or similar etl software; demonstrated ability to effectively troubleshoot issues and perform root-cause analysis; strong interpersonal and communication skills preferred: relevant experience in the pharma or cro industry a plus; experience with integrating clinical trial data is preferred; experience in healthcare or life sciences; programming skills in c# java javascript or related languages; experience with 21cfr part 11 software validation; experience with master data management approaches tools and design patterns job description: we are looking for a highly motivated senior data engineer to join our data engineering team you will be part of a center of excellence that enables best in class leveraging of data to support advanced analytics you will serve as subject matter expert on data structures data sources and tools techniques and methodology to process data that enables advanced analytics the team provides ad-hoc analytical and decision-making support while also aiding in the development of iterative analytical prototypes this position requires experience working with large volumes of transactional data and an understanding of data mining test & learn and statistical techniques you will help shape the strategic direction of business analytics and ensure analytical best practices and techniques are used throughout the enterprise will do: + identify extract assess manipulate and analyze internal and external data from multiple sources including pos transaction data syndicated data and other pertinent information to support advanced analytics + leverage sql
and scripting skills to help business partners (merchandising ops finance marketing tech etc ) implement requested reporting functionality and accessibility build data and reporting proofs of concept as needed; develop reporting and analysis templates and tools + collaborate with the entire team to design new algorithms drive optimization and continuously improve our analysis + conduct category basket item customer and financial analysis + develop kpis identify the levers contributing to the kpi and quantify how movement in the levers contributes to overall kpi changes + support the data science team's efforts for the continual use of testing to drive smart decision making while understanding the nature of a decentralized corporate structure + coach and develop data engineers + work with business partners to identify gaps in analytical processes and reporting; then problem solve with various other teams to fill gaps must have: + bachelor's degree in a related field and 5-8 years' relevant experience or equivalent combination of education and experience master's degree preferred + exceptional sql skills: hands-on deep experience querying large complex data sets + data blending: hands-on experience with analyst-focused data blending tools (e g alteryx) preferred + intellectual and analytical curiosity – initiative to dig into the "why" of various results and a desire to grow in responsibility to become a strategic thought leader in pricing and category analysis + knowledge of retail business to include all aspects of pricing merchandising retail operations financial operations and reporting a plus + proven ability to take complex business problems from concept to completion and make the results accessible and relevant to business users at all levels often requires interaction with multiple departments and senior leadership + self-directed innovative thinker with a strong attention to detail and commitment to consistently meeting timelines and operating from a sense of urgency + creative problem solving skills maturity to handle ambiguity flexibility to perform in a dynamic environment and ability to multitask across projects simultaneously + positive attitude ability to work well and fast under pressure exceptional customer service attitude comfortable in a decentralized structure where results are based on cooperation with and influence of others + work effectively with others on a team + ability to travel up to 10% + ability to maintain confidentiality at whole foods market we provide a fair and equal employment opportunity for all team members and candidates regardless of race color religion national origin gender sexual orientation age marital status or disability whole foods market hires and promotes individuals solely on the basis of their qualifications for the position to be filled who are we?
At Whole Foods Market we provide a fair and equal employment opportunity for all team members and candidates regardless of race, color, religion, national origin, gender, sexual orientation, age, marital status, or disability. Whole Foods Market hires and promotes individuals solely on the basis of their qualifications for the position to be filled.

Who are we? Well, we seek out the finest natural and organic foods available, maintain the strictest quality standards in the industry, and have an unshakeable commitment to sustainable agriculture. Add to that the excitement and fun we bring to shopping for groceries, and you start to get a sense of what we're all about. Oh yeah, we're a mission-driven company too. Whole Foods Market attracts people who are passionate, about great food, about the communities they live in, about how we treat our planet and our fellow humans, and who want to bring their passion into the workplace and make a difference. Learn more about careers at Whole Foods Market here!

2017-041 Sr. Data Architect. About Rigil: Rigil is an award-winning, woman-owned 8(a) small disadvantaged business that specializes in technology consulting, strategy consulting, and product development. We value teamwork and strive to build strong leaders. Job type: full time. Location(s): Silver Spring, Maryland.

Description: Rigil is recruiting for a Sr. Data Architect to provide the data architecture, data modeling, data management, and web services development to push and pull data to and from other sources. The individual will also manage services standards for reference data and the capabilities required to describe, organize, integrate, share, and govern data and information as an enterprise asset in an application-independent manner.

Duties and responsibilities:
- Develops and manages enterprise data architecture: enterprise data modeling of conceptual, logical, physical, and canonical models.
- Responsible for working with business process teams and leaders to understand their business information requirements and to classify, organize, make available, and govern the data needed to meet those requirements.
- Develops exchange data flows, functional documentation, logical and data models, and XML validation schemas (XSD); ensures customers, end users, developers, and other stakeholders have a common understanding of exchange standards and models.
- Works with project teams to develop and document information requirements and business rules in the form of conceptual, logical, physical, and canonical data models and data flow diagrams.
- Formulates the data architecture to meet business information requirements and support the business rules, utilizing knowledge of data sources, data flow, and system interactions of existing automated systems.
- Provides technical assistance to users and team members on the various international, industry, United States government, and National Information Exchange Model (NIEM) data exchange standards.
- Analyzes data requirements through life cycle and value chain analysis and ensures enterprise integration.
- Determines and manages reference data (code lists and enumerations).
- Determines and manages authoritative and approved replicated sources of all types of data; manages metadata and data registries.
- Provides data quality specifications, analysis, measurement, and improvement recommendations; ensures technical compliance and alignment with standards.
- Develops web services using JSON and other current technologies.
- Ensures customers, end users, developers, and other stakeholders have a common understanding of data architecture models.
- Keeps management and team members apprised of project status through written reports, formal briefings, and discussions.
- Applies knowledge of business objectives and the overall design and operation principles of enterprise systems; applies knowledge of NoSQL databases (MongoDB).

Required skills: a qualified candidate will possess the following:
- Ability to understand core business and corporate strategy.
- Skill in building component-based XML or JSON structures to support a service-oriented architecture (SOA).
- Knowledge of UML modeling standards, XML transformation, XML validation schemas (XSD), and validation tools such as (but not limited to) XSLT and Schematron (a validation sketch follows this posting).
- Experience with ICAO AIXM aeronautical information data.
- Experience with information architecture concepts (common information model, canonical data model, enterprise information management, metadata management, data governance, and master data management).
- Experience with data modeling, data mapping, data profiling, and data quality.
- Ability to design and develop solutions with incremental steps toward a long-term strategy vision.
- Ability to communicate technical concepts in business terms, in writing and verbally.
- Ability to identify opportunities for efficiency and areas of potential conflict; ability to consolidate themes in order to maximize resources.
- Ability to build consensus and influence stakeholders to obtain buy-in.
- Proficiency in enterprise data management, as well as experience with relational and object-oriented database technology: SQL, MS SQL Server, Oracle, MongoDB, NoSQL databases.
- A plus: understanding of current approaches to data modeling (entity-relationship, UML); knowledge of data warehousing and data marts.
- Ability to utilize enterprise architecture concepts to evaluate and develop data standards and related processes, with particular expertise in XML and JSON.

Education requirements: BS in computer science or a similar technical field.

Required experience: a qualified candidate will have a minimum of five years of experience in the following areas: web services development; knowledge of and experience with enterprise architecture frameworks; proficiency using enterprise architecture and data management tools (System Architect, Enterprise Architect, virtualization). Federal business knowledge a plus; AIXM and/or aeronautical background a plus; Denodo, Tableau, System Architect, and Enterprise Architect tool knowledge a plus.

Application instructions: To be considered for this position, please apply at www.rigil.com/careers. Rigil is an equal opportunity employer. Rigil considers applicants for all positions without regard to race, color, religion, sex, national origin, age, marital status, sexual preference, personal appearance, family responsibility, the presence of a non-job-related medical condition or physical disability, matriculation, political affiliation, veteran status, or any other legally protected status. Rigil requires a pre-employment background investigation.
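As a concrete illustration of the XSD validation work this posting describes, the sketch below validates an exchange message against a schema using Python's lxml library; the file names are hypothetical.

```python
from lxml import etree

# Hypothetical files: an exchange schema and a message instance.
schema = etree.XMLSchema(etree.parse("exchange_schema.xsd"))
message = etree.parse("flight_update.xml")

if schema.validate(message):
    print("message conforms to the exchange schema")
else:
    # error_log carries a line number and reason for each violation
    for error in schema.error_log:
        print(f"line {error.line}: {error.message}")
```

Business-rule checks beyond what XSD can express (the Schematron layer mentioned above) can be handled similarly, for example via lxml's isoschematron module.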
What you'll be called: Business Intelligence Data Architect and Developer. Where you'll work: KWRI headquarters, Austin, TX. Named Happiest Company to Work For 2018, one of the Best Places to Work in Austin, TX, and winner of the prestigious Training 125 award from Training magazine in 2017, Keller Williams Realty International (KWRI) thrives within a creative and collaborative culture where being at the forefront of real estate is our primary goal.

Who are we looking for? As the largest real estate franchise company in the United States, our goal is to lead with technology that is disruptive in the real estate industry. You will be joining fast-paced teams developing SaaS applications for our franchise (US and worldwide) offices and real estate agents to operate their independent real estate businesses. The BI Architect and Developer will work closely with our BI team, solutions and infrastructure architects, product managers, DBAs, DB developers, and infrastructure team to re-architect and integrate our current reporting data systems from multiple legacy environments, as well as new systems in development. You will assist in the coordination and organization of multiple data sources, and recommend, strategize, and implement data warehouse structures to enable BI reporting for KWRI and the field.

What will you do?
- Work with DBAs, business units, and BI team members to analyze current data organization, data structures, and models as they pertain to BI development of reports and dashboards.
- Assist with design, development, and implementation of database architectures and database solutions for systems (old and new) used across the organization, to enable integration with other data sets.
- Create data integration workflows and SQL for queries and analytics.
- Work with DBAs to ensure reporting metrics are used consistently in the APIs that feed applications, so that reporting metrics are in alignment across all systems.
- Along with the DBA team, assist in the migration of reporting structures from Oracle and MySQL to Redshift (a sketch of one such load step follows this posting).
- Work with BI team members to gather requirements from subject matter experts, product owners, and executives to clearly identify, design, and articulate data relationships.
- Build awareness of data management and quality, so that everybody is aware of the impacts of quick decisions months down the line.
- Assist in defining best practices to ensure the quality of the data warehouse.
- Work with the BI team to consult on, design, and support reports and dashboards for KWRI and the field.

Required qualifications: bachelor's degree (BA) from a 4-year college or university in computer science, or equivalent; 3-5 years of experience developing and implementing data warehousing solutions; strong PL/SQL development skills; expert-level data modeling ability; 5 years developing business intelligence reports and dashboards (Tableau and SAP preferred); experience with ETL tools and processes.

Preferred qualifications: experience with Oracle, MySQL, Redshift; experience with big data storage and analytics technologies is desired.
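A minimal sketch of the Oracle/MySQL-to-Redshift load step mentioned above: stage an extract on S3, then issue a Redshift COPY. The bucket, table, and connection details are hypothetical, and it assumes the boto3 and psycopg2 packages.

```python
import boto3
import psycopg2

# Stage a CSV extract from the legacy database on S3 (bucket is hypothetical).
boto3.client("s3").upload_file("listings_extract.csv",
                               "kw-bi-staging", "listings/listings_extract.csv")

# COPY is Redshift's bulk-load path; it reads directly from S3.
conn = psycopg2.connect(host="example-cluster.redshift.amazonaws.com",
                        port=5439, dbname="bi", user="etl", password="...")
with conn, conn.cursor() as cur:
    cur.execute("""
        COPY reporting.listings
        FROM 's3://kw-bi-staging/listings/listings_extract.csv'
        IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy'
        CSV IGNOREHEADER 1;
    """)
```

Loading through COPY rather than row-by-row INSERTs is the usual choice here because Redshift parallelizes the S3 read across the cluster.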
Manage and control the acquisition of various sources of data. This includes obtaining engine data from various aerospace (and potentially other industry domain) devices, saving the data appropriately to provided internal repositories, and manipulating such data into the formats needed for transfer to other systems. Definitions of data formats may also be created and will, at a minimum, be supported and maintained regardless of their origin. Requirements:
• Bachelor of Science in Engineering
• 3-6 years of software development experience
• Embedded C programming experience
• Communication protocols, including USB 2.0
• Experience with data analysis, manipulation, transfer, etc.
• UI development, preferably Windows XP to Windows 10 applications
• Fault management software, including detection, clearing, and reporting
• Travel will be 15-25%
• Must be eligible to work in the United States without sponsorship
Working knowledge of:
• Microsoft Excel
• VBA
• SQL and PL/SQL
• Configuration updates
Soft skills:
• Customer-driven
• Communication: written, verbal, and face-to-face
• Following processes

Works with finance, BI, HR, IT, clinical, quality, and other COEs and leads sessions to gather requirements; helps maximize data-driven outcomes by ensuring data quality and integrity; directs the development and maintenance of novel data analytics solutions; works with BI and IT teams to define and create reporting and dashboard tools to support the evolving reporting needs of the organization; performs other related duties as assigned.

Knowledge, skills, and abilities: must have working experience in the reporting and analytics space prior to Workday experience; knowledge of Workday data sources and reporting functional areas; excellent analytical, critical thinking, and problem-solving skills; excellent oral and written communication; proactive, solutions-oriented, and goal-driven; must be able to understand the big picture and execute; ability to evaluate, validate, and manipulate a large volume of data; strong task management and organizational skills required; this position requires the use of independent judgment; position requires approximately 20-30% travel time.

Additional job description. Education and experience criteria: high school diploma or equivalent required; associate degree or some college preferred; bachelor's degree a plus; at least 5 years of related computer science, information technology, and/or data management experience required, 7-10 years of experience preferred; a minimum of 1½ years of direct Workday experience developing reports, dashboards, and scorecards; expertise in Microsoft Office products, specifically Excel, Word, Outlook, and PowerPoint; project management experience desired.

About us: CrowdTwist is seeking a Data Engineer for our growing team based in New York City. In this role you will have full ownership of the CrowdTwist data pipeline to satisfy our ever-growing data needs. You will be involved in many different projects as we continue to grow and enhance our platform. You'll be working with large data sets across millions of users and will directly influence the loyalty program offerings for many of our clients, such as Pepsi, Marvel, Carhartt, Nestlé Purina, AMC's The Walking Dead, TOMS, Steve Madden, and Zumiez. We are a venture-backed, NYC-based company that provides the most comprehensive omni-channel loyalty and analytics solutions for industry-leading brands. We're relaxed, experienced, hard-driving, and changing our industry. This role is for an engineer who loves to roll up their sleeves, dive in, and tackle any problem with speed and precision. You will work closely with the data engineering and product teams to implement, mature, and optimize our data warehouse, reporting, and AI data models, which drive actionable insights across the platform. You will be utilizing your experience with object-oriented software design and implementation, data modeling, and building high-performance, highly scalable data processing applications. We operate in an agile manner, so you must thrive in a fast-paced environment and enjoy shipping product.

In this role you will work with a broad tech stack, including but not limited to: Java, Python, Oracle, Redis, RabbitMQ, Docker, CloverETL, Amazon Web Services (Aurora, Redshift, DynamoDB, S3, Lambda, Kinesis, ElastiCache). This is what we're using today, but we're looking to take our data processing to the next level; we're looking for free thinkers who are open to new approaches and tech stacks. A sketch of one pipeline step on this stack follows below.

About you:
- Have a BA/BS in computer science, engineering, information systems, or equivalent experience.
- Self-starter with 3+ years of experience as a data engineer dealing with large, complex data scenarios.
- Proven ability to work with varied data infrastructures, including relational databases, column stores, NoSQL databases, and file-based storage solutions.
- Experience with both compiled and scripting languages.
- Expert-level SQL skills.
- Excellent communication (verbal and written) and interpersonal skills, and an ability to effectively communicate with both business and technical teams.
- Write great code, view it as a craft, and love cranking out solid work.
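To ground the stack list above, here is a minimal sketch of one hypothetical pipeline step on the AWS services named: reading loyalty events from a Kinesis stream with boto3. The stream name, shard, and event shape are assumptions.

```python
import json
import boto3

kinesis = boto3.client("kinesis")

# Hypothetical stream of loyalty-program events, read from one shard.
it = kinesis.get_shard_iterator(StreamName="loyalty-events",
                                ShardId="shardId-000000000000",
                                ShardIteratorType="TRIM_HORIZON")["ShardIterator"]

out = kinesis.get_records(ShardIterator=it, Limit=100)
for record in out["Records"]:
    event = json.loads(record["Data"])  # e.g. {"user_id": ..., "points": ...}
    print(event)
```

A real consumer would loop on the `NextShardIterator` returned by each call and enumerate shards rather than hardcoding one.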
We have a fun, generous company culture that's built on our fundamental principle that when you give more, you get more: competitive compensation and a generous employee benefits package, including fully paid medical, dental, and vision plans for employees and their dependents; stock options for all employees; we bring toys to the office, but still think the most fun thing to do is build product; a collaborative work environment that fosters bonds beyond the workplace; we treat our people well, with employee recognition programs and referral bonuses; better-than-average team-building activities such as paintball, go-kart racing, shuffleboard, organized sporting teams, and off-site retreats; we provide lunches, drinks, and snacks so our team can be hungry for other things; learn from and teach each other at CrowdTwist U; pet-friendly office environment. If this sounds like you, get in touch. We're cool, relaxed, experienced, hard-driving, changing our industry, and looking for smart people like yourself to help tackle tough technical challenges. This is a full-time position based in our New York City office. Apply by sending us your resume and cover letter. Relocation assistance will be available if needed.

Join our big data team and work on dynamic, long-term projects. The majority of our team members are long-term employees who enjoy consistent work and a collaborative team approach!

Key accountabilities: provides thought leadership to clients; assists the sales team in technical sales meetings as well as the creation of development statements of work; leads the implementation of development statements of work; leads big data developer staff (i.e., scrum-master-like); designs, develops, and implements web-based Java applications to support business requirements; follows approved life cycle methodologies, creates design documents, and performs program coding and testing; resolves technical issues through debugging, research, and investigation; relies on experience and judgment to plan and accomplish goals; performs a variety of tasks; a degree of creativity and latitude is required; typically reports to a supervisor, manager, or architect; codes software applications to adhere to designs supporting internal business requirements or external customers; standardizes the quality assurance procedure for software; oversees testing and develops fixes; contributes to the design and development of high-quality software for large-scale Java, Spring Batch, and Hadoop distributed systems, loading and processing data from disparate data sets using appropriate technologies, including but not limited to those described in the skills section.

Requirements: requires a bachelor's degree in the area of specialty and experience in the field or in a related area, with an expectation of a master's degree in computer science; will review candidates with an equivalent level of experience who hold multiple certifications in big data technologies. Must have excellent communication skills, as this will be a highly customer-facing position. Familiar with standard concepts, practices, and procedures within the field (ETL, analytics, OOA, OOD, UML, design patterns, refactoring, networking, unit and component-level testing). Expert in Hive SQL and ANSI SQL, with great hands-on skill in data analysis using SQL: the ability to write simple to complicated SQL (see the sketch after this posting), in addition to the ability to comprehend and support data questions and analysis using already-written, existing, complicated queries. Familiarity with DevOps (Puppet, Chef, Python). Understanding of big data concepts and common components, including Hadoop components (Pig, Hive, Flume, Kafka, Storm, MapReduce, HDF, NiFi, Falcon, Oozie, HBase, Impala, Big SQL),
Spark, cloud components (Google Cloud Platform, AWS, Azure), and multiple languages (Java, Scala, Python).

Benefits: For more than 20 years, Moser Consulting has been the go-to source for exceptional IT talent with the ability to self-manage. At Moser Consulting, our people are our #1 asset. We hire the best people, welcome them like family, connect them with opportunities, and let them do what they do best: produce innovative solutions to technology problems. Our culture gives us a competitive advantage by keeping our employees happy and healthy and by lowering stress levels in a very demanding industry. It is no accident that we are recognized as one of the Best Places to Work in Indiana. We focus on giving employees: an incredible work space; a fun, collaborative, and creative atmosphere; an extremely generous compensation package; and dozens of outstanding and unique perks usually not found at one company.
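As an illustration of the "simple to complicated SQL" this posting asks for, the sketch below runs a windowed Hive query from Python via the PyHive package; the table and connection details are hypothetical.

```python
from pyhive import hive  # assumes the PyHive package is installed

# Hypothetical Hive table of order events; connection details are assumptions.
conn = hive.Connection(host="hive.example.com", port=10000, username="analyst")
cur = conn.cursor()

# A windowed query: rank each customer's orders by spend and keep
# only their three most valuable orders.
cur.execute("""
    SELECT customer_id, order_id, order_total
    FROM (
        SELECT customer_id, order_id, order_total,
               ROW_NUMBER() OVER (PARTITION BY customer_id
                                  ORDER BY order_total DESC) AS rk
        FROM sales.orders
    ) ranked
    WHERE rk <= 3
""")
for row in cur.fetchall():
    print(row)
```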
Job description: We are looking for someone who can help us implement our vision of creating an advanced analytical services platform that leverages internal and third-party data to provide information and insights. A successful candidate should be comfortable working with a wide range of technologies, have a background in data-intensive applications, and have extensive experience with the Google Cloud Platform (GCP).

Responsibilities:
- Leads the technical architecture and design for new data platforms on GCP.
- Analyzes data to gain insight into business outcomes and builds solutions to support decision-making.
- Builds prototypes and proofs of concept to validate ideas.
- Provides architecture and technology leadership across GCP in big data and cloud data processing technologies.
- Helps transition from our current data analytics infrastructure to a high-performance, scalable analytics platform on GCP.
- Models business processes for analysis and optimization.
- Designs, builds, and maintains secure data infrastructure and processes on GCP.
- Designs, builds, and maintains data structures and databases on GCP.
- Designs, builds, and maintains data processing systems on GCP.
- Designs, builds, and maintains data pipelines on GCP (see the sketch after this posting).
- Maps business requirements to data representations on GCP.
- Interacts with the project team members responsible for developing reports, interfaces, data conversion programs, and application extensions.
- Presents technical information in easily understood terms, both verbally and in writing.
- Works closely with fellow IT staff and project management to document standards and procedures for the definition of reference architectures (architecture roadmap).
- Interacts and works directly with customer team members on projects and support issues; facilitates communication upward and across project teams.

Skills and experience required: 8+ years' experience in information technology in the data space; 3+ years minimum of demonstrable experience in a technical leadership role; strong knowledge of data warehouse architecture and data integration concepts; real-time experience on GCP data engineering project implementations; real-time experience with data migration projects from on-premises to cloud; expertise in the following GCP products: BigQuery, Dataflow, Dataprep, Dataproc, Stackdriver, Cloud Functions, Airflow; expertise in any of the following ETL/ELT tools: Talend, Matillion; strongly desired: Data Engineer certification from Google; desired: exposure to Informatica PowerCenter, Teradata; excellent communication skills, both verbal and written; comfortable working with all project stakeholders (business users, architects, project managers, business analysts, developers, test analysts, production support team); entertainment industry experience is strongly desired.

Educational requirements: bachelor's degree in computer science or a related discipline; master's degree in a related field is preferred.
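A minimal sketch of the kind of GCP pipeline step the posting above describes, using the google-cloud-bigquery client library; the project, dataset, and table names are hypothetical.

```python
from google.cloud import bigquery  # assumes google-cloud-bigquery is installed

client = bigquery.Client(project="example-analytics")  # hypothetical project

# Aggregate a hypothetical raw events table into a daily rollup.
query = """
    SELECT DATE(event_ts) AS day, COUNT(*) AS events
    FROM `example-analytics.raw.events`
    GROUP BY day
    ORDER BY day
"""
for row in client.query(query).result():
    print(row.day, row.events)
```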
Designs and builds relational databases; develops strategies for data acquisition, archive recovery, and implementation of a database; works in a data warehouse environment, which includes data design, database architecture, metadata, and repository creation; translates business needs into long-term architecture solutions; defines, designs, and builds dimensional databases; develops data warehousing blueprints; evaluates hardware and software platforms and integrates systems; evaluates reusability of current data for additional analyses; reviews object and data models and the metadata repository to structure the data for better management and quicker access.

Computing environment: MCSA 2003, MCSE 2003, M2003.NET, M2008.NET, MCITP: SA, MCITP: EA, CSA: WS2012, MCSE: WS2012, MCSA: WS2008, MCSM: DS, M2012.NET, MCM: Windows Server 2008: Directory, MCA: MS Windows Server Directory, CCAA, GCWN.

Bachelor's degree in computer science or a related technical discipline, or the equivalent combination of education, technical certifications or training, or work experience; five (5) years of relevant experience. Must possess IT-I access clearance or have a current Single Scope Background Investigation (SSBI) or equivalent. Must possess certification meeting DoD 8570.01 IAT Level II (Security+). Experience working with modeling tools (e.g., one or more of the following: ERwin, PowerDesigner, ER/Studio, or other similar tools).

As a trusted systems integrator for more than 50 years, General Dynamics Information Technology provides information technology (IT), systems engineering, professional services, and simulation and training to customers in the defense, federal civilian government, health, homeland security, intelligence, state and local government, and commercial sectors. With approximately 32,000 professionals worldwide, the company delivers IT enterprise solutions, manages large-scale, mission-critical IT programs, and provides mission support services. GDIT is an equal opportunity/affirmative action employer; all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or veteran status, or any other protected class.

**Data Visualization Developer**

**Description:** As an early- to mid-career data visualization developer, you might be wondering what makes MITRE different from other government contractors and consulting firms. Many of our recent hires answered yes to at least one of the questions below:
+ Are you interested in contributing to technical projects that you can feel passionate about?
+ Are you tired of having your performance tied to billable hours?
+ Do you want to feel like your ideas are valued and you are more than just an FTE?
+ Are you interested in working for a company where a 40-hour week is the norm and not the exception?
+ Do you want exceptional benefits, including tuition reimbursement for coursework towards an advanced degree in a technical discipline?

The Model-Based Analytics department is seeking a motivated, creative data visualization developer to apply cutting-edge tools and techniques to challenging problems facing the US government. This position offers the opportunity to interact directly with government leaders in a fast-paced and dynamic atmosphere, with the ability to make a difference and impact outcomes with solid analysis. Our employees are expected to work on multiple projects, cross-pollinate good ideas across the government, and continuously learn. We are truly an interdisciplinary department that thrives on having a large toolbox to support data- and model-driven decision making. We value people who can bring diverse perspectives. Our department works to solve problems in the public interest in partnership with the government. We also actively conduct independent research in areas of interest to our government partners. Once established, with a proven track record of quality delivery and consistent performance, members of our department have latitude to select projects.

To be successful you will:
+ Be passionate about leveraging modern UI technologies to enhance stakeholder interaction with analytic models.
+ Have an innate curiosity and interest in developing research questions and testing hypotheses with open-ended tasking.
+ Work with a spectrum of government sponsors to gain understanding of their challenges, evaluate possible solutions, and conduct insightful, actionable analyses.
+ Support the development and application of a variety of analytical models to sponsor challenges, with a willingness to adapt and learn.
+ Present results in an intuitive, actionable manner that can be understood by all sponsor audiences, regardless of technical expertise.
+ Provide relevant day-to-day tasking for one or more junior staff.
+ Lead projects throughout their life cycle, including conceptualization, requirements definition, data procurement, development, integration, and socialization.

**Qualifications**

Required qualifications; candidates will be required to have:
+ Significant academic concentration in at least one of the following disciplines: computer science, data science, mathematics, operations research.
+ Excellent written and verbal communication skills.
+ Demonstrated ability to rapidly prototype interactive data visualizations in web applications (d3.js, AngularJS, node.js), R Shiny, Python Jupyter notebooks, or other open-source technologies (a small example follows this posting).
+ Demonstrated ability to manipulate large datasets with at least one modern programming language or business intelligence platform (e.g., Python, R, SAS, MATLAB, Java, C).
+ Ability to apply, modify, and formulate algorithms and processes to solve challenging problems.
+ Demonstrated experience leading customer-facing engagements emphasizing interactive data visualization.

Preferred qualifications; preference will be given to candidates with desired experience including:
+ Research experience.
+ Prior experience working with databases (e.g., PostgreSQL, Oracle, MySQL, MongoDB, Neo4j).
+ Prior experience developing programmatic solutions in a collaborative environment that utilizes source code management (e.g., git, Mercurial, SVN).
+ Advanced degree in computer science, data science, mathematics, operations research, or another related technical field of study.

**Job:** SW Eng, Comp Sci & Mathematics. **Primary location:** United States-Virginia-McLean. **Other locations:** United States-Massachusetts-Bedford. **This requisition requires a clearance of:** Secret. **Travel:** yes, 5% of the time. **Job posting:** Mar 7, 2018, 7:49:23 AM. **Req ID:** 00049229.
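As a toy example of the interactive-visualization prototyping described above, the sketch below renders an interactive scatter plot with the plotly package, one of many possible open-source choices; the data is made up.

```python
import pandas as pd
import plotly.express as px  # assumes the plotly package is installed

# Made-up analytic-model output: score vs. cost for candidate options.
df = pd.DataFrame({
    "option": ["A", "B", "C", "D"],
    "cost": [1.2, 3.4, 2.1, 4.8],
    "score": [0.61, 0.83, 0.74, 0.90],
})

fig = px.scatter(df, x="cost", y="score", text="option",
                 title="Candidate options: score vs. cost")
fig.write_html("options.html")  # self-contained interactive page
```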
Mastech Digital provides digital and mainstream technology staff as well as digital transformation services for leading American corporations. We are currently seeking a Senior Manager (Data Engineering) for our client in the industrial supply distribution domain. We value our professionals, providing comprehensive benefits, exciting challenges, and the opportunity for growth. This is a contract position, and the client is looking for someone to start immediately.

Duration: 8-month contract. Location: Buffalo Grove, Illinois. Compensation: depends on experience.

Role: Senior Manager (Data Engineering). Primary skill: ETL. Role description: the Senior Manager (Data Engineering) would need to have at least 6+ years of experience.

Required experience and skills:
- 6+ years of experience in data warehousing.
- Experience designing and building export, transform, and load (ETL) processes.
- Experience with AWS Redshift and/or Google BigQuery.
- Experience in SQL or similar languages, and development experience in Python.
- Data architecture skills.
- Experience scaling and managing 5-10-person teams.

Education: advanced degree in computer science, math, physics, or another technical field. Experience: minimum 6+ years. Relocation: no, this position will not cover relocation expenses. Travel: N/A. Local preferred: yes. We are looking only for candidates willing to join us directly as W2 employees. Recruiter name: Nikhil Kumar. Recruiter phone: 877-884-8834 (ext. 2174). EOE.

Job description: Core competencies (must have): 5-10 years Informatica (ETL) experience; BDM; TDM; data quality; big data; data warehouse; star schema; very strong verbal and written communication skills; bachelor's degree in a technical field such as computer science, computer engineering, or a related field required. Nice to haves: hands-on Unix knowledge. Additional information: all your information will be kept confidential according to EEO guidelines.

The Data Integration Quality Assurance Analyst will conduct full life cycle testing and traceability of data through the application, data warehouse extracts, and reports. This is a strategic role, and the analyst is expected to proactively define, implement, and enforce data testing approaches. The Data Integration Quality Assurance Analyst will work across the company supporting our client solutions and will work directly with data architects, DBAs, ETL developers, software engineers, and project managers.

Responsibilities: responsible for designing, developing, automating, monitoring, and maintaining data quality assurance processes that transfer data to and from internal and external locations (a sketch of one such check follows this posting); ensure data integrity as data flows from our customers into our systems and into our data warehouse; interact with business analysts and technical leads to estimate testing efforts and ensure accurate requirements fulfillment; develop and document requirements and solutions; engage in project planning and delivering on commitments; participate in daily stand-up meetings, planning meetings, and review sessions (using Scrum agile methodology); participate in design and code reviews; interact with cross-functional teams to ensure complete delivery of solutions; troubleshoot issues, making recommendations and delivering on those recommendations; ensure data quality approaches satisfy internal compliance policies and are HIPAA compliant.
Knowledge and experience: significant experience with SQL, SQL Server databases, and ETL tools and techniques; must be a SQL expert with a passion for ensuring data is consistent and accurate; exposure to data modeling tools like ERwin, Visio, Toad Designer, etc.; demonstrated working knowledge of data integration tools such as secure FTP servers; must be a highly motivated team player who is a quick learner and can work independently or on a team, driving solutions with minimal supervision; experience with full software development lifecycle methodologies, including agile; a flexible team player capable of juggling multiple priorities and willing to do what it takes to meet critical deadlines; an effective communicator who is able to work with executives, engineers, managers, QA testers, internal customers, and external partners; strong written communication skills, including but not limited to requirements and testing documentation; understanding of data warehouse approaches, industry standards, and industry best practices; 3-5 years developing and implementing data quality processes and approaches; experience with SAS and/or a business rules engine technology is a plus.

Formal education and certification: university degree in the field of computer science, IT, business administration, or another rigorous discipline.
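One minimal sketch of the kind of automated data quality check this QA role describes: reconciling row counts and a simple column checksum between a staging extract and the warehouse table. It assumes the pyodbc package, an ODBC DSN, and hypothetical table and column names.

```python
import pyodbc  # assumes an ODBC DSN for the SQL Server instance

conn = pyodbc.connect("DSN=warehouse")  # hypothetical DSN
cur = conn.cursor()

def profile(table):
    # Row count plus a simple checksum over a key column.
    cur.execute(f"SELECT COUNT(*), SUM(CAST(claim_id AS BIGINT)) FROM {table}")
    return tuple(cur.fetchone())

source = profile("staging.claims_extract")  # hypothetical tables
target = profile("dw.claims")
assert source == target, f"load mismatch: {source} vs {target}"
print("row counts and checksums reconcile")
```

In practice a suite like this would run per load, cover more columns, and log failures rather than assert, but the shape of the check is the same.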
At Abbott, we're committed to helping people live their best possible life through the power of health. For more than 125 years, we've brought new products and technologies to the world, in nutrition, diagnostics, medical devices, and branded generic pharmaceuticals, that create more possibilities for more people at all stages of life. Today, 99,000 of us are working to help people live not just longer, but better, in the more than 150 countries we serve.

Abbott has an immediate need for a Big Data Architect to join our team in Chicago, Illinois, working with the big data and advanced analytics team in the Commercial Digital and Innovation (CDI) group of Business Technology Services (BTS).
• The Big Data Architect will be a player-coach who will not only architect solutions but will also be hands-on in guiding the team and building the solution.
• Servant leadership: able to garner respect from product team(s) and willing to get hands dirty to get the job done.
• Knowledgeable: deep understanding of and experience with the big data ecosystem, open source projects, the data value chain, and architecture patterns; experience with everything listed is not required.
• Communication: strong verbal and written communication skills.
• Facilitation: able to lead architecture work sessions and articulate architecture components to data engineers, DevOps engineers, architects, and senior management.
• Assertion: able to ensure architecture concepts and principles are adhered to; must be able to be a voice of reason and authority and make the tough calls.
• Continual improvement: continually growing in your craft, learning new tools and techniques to ensure the architecture stays relevant, reliable, and scalable.
• Conflict resolution: able to facilitate tough discussions and propose alternatives or different approaches.
• Transparent: bring disclosure and visibility to the business about development progress and grow business trust.
• Influencer: embrace the role that will influence and impact the transformation of online distribution of premium digital assets for Abbott.

Core job responsibilities:
+ Architect and implement roadmaps and bring to life revolutionary new analytics and insights.
+ Provide guidance and platform selection advice for common big data (distributed) platforms.
+ Design data flow and processing pipelines for ingestion and analysis using modern toolsets such as Spark on Scala, Kafka, Flume, Sqoop, and others (a sketch follows at the end of this posting).
+ Develop and recommend novel and innovative, yet proven and demonstrated, approaches to solving business and technical problems using analytics solutions.
+ Design data structures for ingestion and reporting specific to use case and technology.
+ Provide data management expertise in evaluating requirements, developing data architecture, and refining platform components and design; data management includes appropriate structuring and stewardship of data semantics, syntax of data attributes, coding structures, and mapping schemes.
+ Design and develop code, scripts, and data pipelines that leverage structured and unstructured data.
+ Guide and coach junior data engineers and DevOps engineers, share best practices, and perform code reviews.
+ Collaborate with cross-functional teams to utilize the new big data tools.
+ Manage architecture for data interchange using microservices, batch workloads, and APIs.
Part of the centralized data office responsible for building and managing advanced analytics products and solutions for Abbott.

Education and experience:
+ BA/BS degree or equivalent experience; computer science or math background preferred.
+ 12+ years of relevant technology architecture, consulting, or industry experience, including information delivery, analytics, and business intelligence based on data from a hybrid of Hadoop Distributed File System (HDFS), non-relational (NoSQL, Redshift), and relational data warehouses.
+ 3 or more years of hands-on experience with data lake implementations, core modernization, and data ingestion.
+ 3 or more years of hands-on working experience with big data technologies: Hadoop, Scala, Python, and big data frameworks (Kafka, Hive, HDFS, MapReduce, YARN, Pig, Oozie, HBase, Spark) and AWS software such as S3 and EMR.
+ Familiarity with commercial distributions of HDFS (Hortonworks, Cloudera, or MapR).
+ 2 or more years of hands-on experience designing and implementing data ingestion techniques for real-time and batch processes for video, voice, weblog, sensor, machine, and social media data into Hadoop ecosystems and HDFS clusters.
+ Experience in architecting and engineering innovative data analysis solutions.
+ Experience with architectural patterns for data-intensive solutions.
+ Understanding of the terminology involved in data science and familiarity with machine learning concepts.
+ Customer-facing skills to represent the big data organization well within the Abbott environment and drive discussions with senior personnel regarding trade-offs, best practices, project management, and risk mitigation.
+ Demonstrated ability to think strategically about business, product, and technical challenges in an enterprise environment.
+ Current hands-on implementation experience required; individual contributors only need apply.
+ Strong verbal and written communication skills and the ability to lead effectively across organizations.
+ The ability to provide strategic and architectural direction to address unique business problems.
+ System design and modeling skills (e.g., domain-driven design, data modeling, API design).
+ Experience with other NoSQL platforms, focused on what requirements drive technology choices.
+ Strong knowledge of standard methodologies, concepts, best practices, and procedures within a big data environment.
+ A certain degree of creativity and latitude is required, as well as flexibility in job role and working hours during critical deliveries.
+ A natural sense of urgency, initiative, and a positive team player philosophy, to be reflected in daily work ethics.
+ Proficient understanding of distributed computing principles, with the ability to architect and explain complex systems interactions, including data flows, common interfaces, APIs, and available methods.
+ Ability to work well with a cross-functional, geographically dispersed team and customer base.
+ Experience with SaaS and cloud-based offerings and products.
+ Must have designed and built a scalable big data infrastructure that has been in use for several years.
+ Experience designing architectures that have to work in a highly regulated industry (healthcare or finance preferred).
+ Experience designing architectures that can incorporate data from multiple data sources.
+ Experience educating other team members on a technology stack.
+ Positive attitude; quick learner with a strong desire to make a big impact with innovative technical solutions.
An equal opportunity employer, Abbott welcomes and encourages diversity in our workforce. We provide reasonable accommodation to qualified individuals with disabilities. To request accommodation, please call 224-667-4913 or email corpjat@abbott.com.
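The Spark/Kafka ingestion pipelines named in the responsibilities above might start out like the following PySpark sketch, which streams raw sensor events from a Kafka topic into Parquet on HDFS; the topic, broker, and path names are hypothetical.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sensor-ingest").getOrCreate()

# Read a hypothetical Kafka topic of raw sensor events as a stream.
events = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "broker1:9092")
          .option("subscribe", "sensor-events")
          .load()
          .selectExpr("CAST(value AS STRING) AS payload", "timestamp"))

# Land the raw payloads on HDFS as Parquet for downstream analysis.
query = (events.writeStream.format("parquet")
         .option("path", "hdfs:///data/raw/sensor_events")
         .option("checkpointLocation", "hdfs:///chk/sensor_events")
         .start())
query.awaitTermination()
```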
For more details, feel free to call us at 908-409-6252. Role: Data Warehouse Developer, Investment Insurance. Location: New York, NY. Job type: contract, through year-end 2018. Interview mode: phone and in person.

Position summary: the developer will be responsible for assisting in the implementation of a data management solution. This includes the analysis, design, development, testing, installation, and initial maintenance of the solution, and acting as an interface with customers and other developers to determine the most efficient and cost-effective approach to meeting business requirements. The project will utilize a variety of hardware and software technologies and may include new code construction, modifications to existing modules, and package implementations. This position will apply disciplined software development processes and utilize leading-edge technologies to engineer and implement solutions to business problems. He or she will display technical and functional competence in data management standards and uphold the client's commitment to ethical business practices.

Candidate responsibilities: collaborate with team members, IT, and software consultants to ensure functional and business specifications are converted into flexible, scalable, and maintainable solution designs; develop and implement data warehouse strategies that utilize data marts and/or data warehouse systems to enhance business processes and manage business intelligence; design and manage data models for applications, metadata, tables, and views, ensuring system, data, and reporting stability, security, performance, and reliability; establish the analytic environments required for structured, semi-structured, and unstructured data; develop data quality metrics that identify gaps and ensure compliance with investment and enterprise-wide standards; take an active role in quality assurance (QA) and user acceptance testing (UAT) for data solution initiatives by providing testing support, attending team meetings, and documentation.

Functional skills: 5+ years of overall experience in investments and insurance data and systems management, data science, programming, and information systems; experience with and implementation of enterprise data management tools for investments (Markit EDM, GoldenSource EDM, etc.), including builds of various inbound and outbound feeds (Intex, Trepp, Markit, Bloomberg, EPAM, etc.); understand and support business data needs; understand data layout and the resulting business impact; track record of collaborating with IT and vendor teams to provide technical solutions and process improvements, delivering end-to-end products and processes on schedule and budget, per business requirements and SDLC standards, including addressing performance and ad-hoc requests; experience with data warehouse and BI systems; working knowledge of financial investments; works effectively with associates across business and management teams and with corporate; adapts to changing business priorities and environments; strong agile and waterfall experience; ability to break down complex business requirements into small user stories.

Position qualifications: bachelor's degree in management information systems, computer science, or a related field; Unix scripting experience; strong development experience in Java, SQL, VBA, R, Python, or similar programming languages; Amazon Web Services experience; Tableau experience (or any other visualization tool); strong knowledge of Excel; strong mathematical, analytical, and communication skills.

Thanks and regards, Rajiv Mishra, Chenoa Information Services Inc. Tel: 908-409-6252. Fax: 732-549-6041. E-mail: rmishra@chenoainc.com. www.chenoainc.com, www.chenoahealth.com.

Caliber Justice is seeking a Data Services Engineer to join their team! The Data Services Engineer works closely with the sales and engineering teams to properly scope and design customer engagements, to validate data structure design, and to assist our customers with data conversions, and also assists the technical support and field service teams with data issues. Extracts, transforms, and loads data from legacy systems; performs server administration and tasks; analyzes, scrubs, and converts customer data and other data sources for use in various Caliber products; identifies possible conversion issues and suggests avoidance solutions; creates conversion processes from various legacy data sources; defines best practices for converting data if exact matches do not exist; provides support to the client and professional services teams as needed.

Core competencies and skills: results-driven professional with the ability to work well and deliver under pressure in a fast-paced environment; passionate team member committed to team success; customer-oriented, delivering on-time, superior solutions that exceed customer expectations; adaptable and responsive to innovation and change, identifying areas for improvement to support business success; successful communicator at all levels, using all media, with excellent interpersonal skills; demonstrates personal accountability, excellence, and integrity.

Job-specific competencies and skills: high level of proficiency in the technical programming language for the work assignment; ability to analyze and summarize highly technical information; ability to apply state-of-the-art technology solutions to software engineering problems; attention to detail and a methodical work process; outstanding interpersonal skills and the ability to deliver excellent customer service to all stakeholders.

Education and experience: bachelor's degree in computer science or a related field, or an equivalent combination of experience and education may be accepted; 1+ years of experience preferred. Caliber Justice is a leading
provider of software for the offender management and jail management markets. The JailTracker solution is a highly specialized product offering, empowering correctional staff and administrators to interact with accurate, real-time information and advanced situational command and control. Caliber Justice is a business unit of Harris, which is a member of the Constellation Software Inc. group of companies. Constellation Software is a rapidly growing conglomerate of vertical market software (VMS) companies, each focused on dominating its respective market niche. Constellation's growth is based on a simple strategy: identify promising VMS firms, acquire them, and then integrate them into the Constellation family while building on their fundamental strengths to help them become world-class organizations. Harris is an EEO/AA/Disability/Vets employer. #weareharris

**Job ID:** 60143BR. **City:** Brooklyn. **State:** New York. **Country:** United States. **Category:** Information Technology. **Job type:**

**Description:** United Technologies Corporation (UTC) has a deep history of innovation that brings together big thinkers, problem solvers, and a culture for pushing the boundaries of what's possible. Our global reach and rich history uniquely position us to succeed in the new digital economy. Our investment in digital innovations will make travel better, people safer, and urbanizing cities more comfortable and connected. UTC is committed to leading in the digital era and will unleash the size and scale of its businesses on the digital world of big data and the Internet of Things. We are looking for the very best thought and technology leaders who will help grow our digital accelerator in Brooklyn, New York, in a vibrant urban oasis overlooking the iconic Brooklyn Bridge. Come join us in this journey.

**About United Technologies,
NYSE: UTX:** With revenues of approximately $57 billion, United Technologies Corporation (UTC) is a Fortune 50 company that provides high-technology products and services for the aerospace and commercial building industries. Our aerospace businesses include Pratt & Whitney and UTC Aerospace Systems. Pratt & Whitney is a world leader in the design, manufacture, and service of aircraft engines; UTC Aerospace Systems is one of the world's largest suppliers of technologically advanced aerospace and defense products. Our commercial building businesses include Otis Elevator and Climate, Controls & Security. Otis is the world's largest manufacturer and maintainer of people-moving products, including elevators, escalators, and moving walkways. UTC Climate, Controls & Security is a leading provider of heating, air conditioning, and refrigeration systems, building controls and automation, and fire and security systems. These companies are leading the way to safer, smarter, sustainable, and high-performance buildings. Ranked among the world's greenest companies, we do business in virtually every country of the world and have over 200,000 employees globally.

**Position:** We are seeking highly motivated data architects to join our digital accelerator in Brooklyn, NY. We are looking for people who have a passion for building systems that analyze and uncover digital insights from large, complex streams of data derived from the world's most advanced aerospace, aviation, and building management systems. As a member of a cross-functional team of product managers, software engineers, data scientists, and designers, you will combine your problem-solving and analytical skills to identify, quantify, and solve real-world problems, leveraging best-of-breed digital technologies. This role will also work closely with UTC's engineering, design, operations, marketing, and service delivery teams to challenge the boundaries of what's possible. This position will provide the unique opportunity to operate in a start-up-like environment within a Fortune 50 company. Our digital focus is geared towards releasing the insights inherent in UTC's best-in-class products and services. Together we aim to achieve new levels of productivity by changing the way we work and identifying new sources of growth for our customers.

**Qualifications:**
+ 7+ years of related experience designing data science and analytical solutions.
+ Deep knowledge of machine learning, statistics, optimization, or a related field.
+ Experience with R, Python, Perl, MATLAB.
+ Experience working with large data sets and distributed computing tools, including Spark, Hadoop, Hive, etc.
+ Experience with industrial and commercial applications of data science, including prognostics and health management, supply chain optimization, and human capital management, is a significant plus.
+ Excellent written and verbal communication skills, along with the ability to work well in cross-functional teams.
+ Work experience in programming, data engineering, and machine learning.
+ Motivated self-starter with a strong enthusiasm to learn.
+ Results-oriented, with strong communication and customer focus.
+ Ability to deal well with ambiguous and undefined problems.

**Education:**
+ Advanced degree in computer science, statistics, operations research, or a related technical discipline.

United Technologies Corporation is an equal opportunity/affirmative action employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age, or any other federally protected class.
W2 only! No C2C! ALIS Software is currently seeking a Senior Data Architect with over 15 years' experience for a 6-month contract opportunity with our client located in Austin, Texas. Please note: you must be able to interview onsite in Austin, TX to qualify for this opportunity!

In this role you will: develop best practices and methodologies to mature enterprise information management (EIM) practices and capabilities; develop enterprise data policy and standards that support the development of an enterprise data governance framework; define the future-state data management model with input from HHS business stakeholders and participation in the Texas enterprise information management group at DIR; perform oversight of data strategy and roadmap implementation; represent and/or support the CDAO in developing touch points between information technology and CADS to enforce EIM-defined best practices, governance, and data policies; serve as the data architecture and EIM expert for the HHS system; develop a process for aligning data architecture frameworks to support the EIM goals and principles; develop detailed knowledge of the underlying HHS data assets and data products and provide subject matter expertise on their content and current and potential future uses; assist in the development and implementation of data quality processes and metrics within the EDG data quality program, particularly data quality improvement processes related to HHS mastered data; assist in exposing and defining interrelationships between core elements of the HHS master data repository and its data products; consult with the business owners of data assets, providers of enterprise analytics, and information technology leadership to develop an enterprise analytics strategy for future funding, design, and implementation; provide expertise in data harmonization, master data development and governance, data quality, and metadata management; provide guidelines on creating data models and various standards relating to governed data; review changes to technical and business metadata, assessing their impacts on enterprise applications, and ensure the impacts are communicated to the appropriate parties.

II. Worker skills and qualifications. Minimum requirements: excellent understanding of data architecture principles, capabilities, and best practices, with the proven ability to
define, document, and apply standards and best practices; experience with business processes, data management, data flows, data relationships, and data quality standards and processes, and proficiency with data analytical and quality tools; extensive experience and domain knowledge in defining and designing master data management and data governance architecture frameworks and capabilities; experience with Informatica data management and architecture tools, particularly those relevant to data quality, master data, and management of enterprise metadata; extensive architecture experience and knowledge in the areas of reference data, data quality, and metadata management; experience leading and evolving corporate culture toward a data-driven organization; experience performing oversight of data quality and master data tool implementations; strong communication, analytical, and interpersonal skills at all levels; strong proven ability to work successfully with technical and non-technical groups and to manage multiple responsibilities; strong technical writing skills; experience with the healthcare or insurance business domain; Certified Data Management Professional (CDMP) preferred.

Other special requirements: E-Verify requirements are outlined in Appendix A, Section P (Immigration) of the standard terms and conditions for the information technology staff augmentation contract. Because of the nature of the information in the system that will be implemented, all entities must sign a data use agreement (DUA) as a condition of employment. Face-to-face interview required.

Required skills and qualifications: BS in computer science or similar. Interested? Please apply by clicking the link provided, or reach out to us directly at 512-827-2266, or send your resume directly to leslie.goodwin@alissoftware.com.

Based in beautiful Irvine, we are CalAmp, the pure-play pioneering leader of the connected car, connected truck, and broader Internet of Things (IoT) marketplace. Currently we are seeking a Data Architect with at least 6 years of experience. This is an exciting opportunity for those who wish to work for a stable, well-established company that builds transformational technologies within the revolutionary domain of IoT. We believe that people are our greatest asset, and we are committed to being an employer of choice in our industry. CalAmp offers an engaging and diverse work environment that permits our people to take pride in their contributions and share in the company's success. Our employees can expect the space to showcase their talent, sharpen their skills, develop new capabilities, and be part of a global team that develops revolutionary technologies. We proudly offer the stability and security of a large, publicly traded tech company without the rigidity and red tape. In particular, we offer: meaningful work with the potential to disrupt an entire industry; visionary leadership; excellent compensation packages; an extensive suite of medical and retirement benefits; a flexible time-off policy and accommodating work schedules; an education assistance program (tuition reimbursement); access to cutting-edge tools and technologies; innovative, intelligent, collaborative teammates.

Responsible for delivering highly available, heterogeneous database servers on multiple technology platforms. Lead all database maintenance and tuning activities, ensuring continued availability, performance, and capacity of all database services across every business application and system. Expected to consider current practices to develop innovative and reliable solutions that will continuously improve the quality of service
Create, update, and maintain the data warehouse for business product reporting needs. Refine physical database design to meet system performance requirements. Identify inefficiencies in current databases and investigate solutions. Diagnose and resolve database access and performance issues. Develop, implement, and maintain change control and testing processes for modifications to databases. Ensure all database systems meet business and performance requirements. Coordinate and work with other technical staff to develop relational databases and data warehouses. Advocate and implement standards, best practices, technical troubleshooting processes, and quality assurance. Develop and maintain database documentation, including data standards, procedures, and definitions for the data dictionary. Produce ad-hoc queries and develop reports to support business needs. Create and maintain technical documentation. Perform other management-assigned tasks as required.

Must haves:
- Bachelor's or master's degree in computer science, mathematics, or another STEM discipline
- 6+ years of experience working with relational databases (e.g., Redshift, PostgreSQL, Oracle, MySQL)
- 2+ years of experience with NoSQL database solutions (e.g., MongoDB, DynamoDB, Hadoop/HBase, etc.)
- 5+ years of experience with ETL/ELT tools (e.g., Talend, Informatica, AWS Data Pipeline; preferably Talend)
- Strong knowledge of data warehousing basics, relational database management systems, and dimensional modeling (star schema and snowflake schema)
- Configuration of ETL ecosystems and performance of regular data maintenance activities such as data loads, data fixes, schema updates, database copies, etc.
- Experience in data cleansing, enterprise data architecting, data quality, and data governance
- Good understanding of Redshift database design using distribution style, sorting, and encoding features (a sketch follows this posting)
- Working experience with cloud computing technologies: AWS EC2, RDS, Data Migration Service (DMS), Schema Conversion Tool (SCT), AWS Glue
- Well versed in advanced query development and design using SQL and PL/SQL, query optimization, and performance tuning of applications on various databases
- Supporting multiple DBMS platforms in production, QA, UAT, and dev, in both on-premise and AWS cloud environments

Strong pluses:
- Experience with database partitioning strategies on various databases (PostgreSQL, Oracle)
- Experience migrating, automating, and supporting a variety of AWS-hosted (both RDBMS and NoSQL) databases in RDS/EC2 using CFT
- Experience with the big data technology stack: Hadoop, Spark, Hive, MapR, Storm, Pig, Oozie, Kafka, etc.
- Experience with shell scripting for process automation
- Experience with source code versioning with Git and Stash
- Ability to work across multiple projects simultaneously
- Strong experience in all aspects of the software lifecycle, including design, testing, and delivery
- Ability to understand and start projects quickly
- Ability and willingness to work with teams located in multiple time zones

If you are a talented data architect with at least 6 years of experience and an interest in the Internet of Things domain, we want to speak with you. Interviews are occurring this week and next, so apply now if interested.
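As a concrete illustration of the Redshift design features this posting names (distribution style, sort keys, and column encodings), here is a minimal sketch. The table, columns, and connection string are hypothetical, and psycopg2 is just one common client that speaks Redshift's PostgreSQL wire protocol.

```python
# Minimal sketch: a Redshift fact table using DISTSTYLE/DISTKEY, SORTKEY,
# and per-column encodings. All object names and the DSN are hypothetical.
import psycopg2  # Redshift accepts PostgreSQL-protocol clients

DDL = """
CREATE TABLE fact_orders (
    order_id    BIGINT        ENCODE zstd,
    customer_id BIGINT        ENCODE zstd,
    order_date  DATE          ENCODE raw,      -- sort key column left unencoded
    order_total DECIMAL(12,2) ENCODE zstd,
    status      VARCHAR(16)   ENCODE bytedict  -- small set of repeated values
)
DISTSTYLE KEY
DISTKEY (customer_id)  -- co-locate rows that join on customer_id
SORTKEY (order_date)   -- lets range predicates on date skip blocks
"""

def create_fact_table(dsn: str) -> None:
    """Create the hypothetical fact table on the cluster named by `dsn`."""
    with psycopg2.connect(dsn) as conn:
        with conn.cursor() as cur:
            cur.execute(DDL)
```

The design choice the posting is alluding to: a join key as DISTKEY avoids cross-node data shuffles, and a date SORTKEY lets the optimizer prune disk blocks for time-bounded queries.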
Company description: Lumen Solutions Inc. is a technology consulting services company based in Virginia. We provide a wide array of experienced business and IT professionals supporting clients from solution design to implementation and support. We specialize in strategy development, portfolio management, and enterprise architecture.

Job description. Role: Senior Big Data Engineer. Location: Washington, DC. Duration: long-term contract.

Qualifications:
- 5+ years of full-time industry experience
- 2+ years of experience with Java (Scala is preferred)
- 2+ years of strong scripting ability in Ruby, Python, or Bash
- 2+ years of working knowledge of relational databases and query authoring (SQL)
- Design and operation of robust distributed systems
- Love to use and develop open-source technologies like Kafka, Flume, Hive, Impala, and Spark (a sketch follows this posting)
- Rigor in high code quality, automated testing, and other engineering best practices
- Ability to write reusable code components

Preferred qualifications:
- BS/MS in mathematics, engineering, or computer science
- Working knowledge of the U.S. healthcare industry
- Experience working with large data environments: petabytes, or hundreds to thousands of terabytes
- Experience with NoSQL technologies such as HBase or Cassandra
- Past experience with data warehouse and data analytics solutions and technologies, especially ETL tools

Additional information. Interview format: 1st, phone screen; 2nd, panel interview. Location: Washington, DC. Duration: long-term contract.

Thanks & regards, K. Rajeshwar, Lumen Solutions Inc. Phone: 703-349-1462.
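Since the Lumen role above centers on Kafka and Spark, a minimal Structured Streaming sketch of the kind of pipeline it implies; the broker address, topic name, and sink are hypothetical, and the job assumes the spark-sql-kafka connector is on the classpath.

```python
# Minimal sketch: consume a Kafka topic with Spark Structured Streaming.
# Broker address and topic name are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = (SparkSession.builder
         .appName("kafka-events-ingest")
         .getOrCreate())

events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker1:9092")
          .option("subscribe", "patient-events")
          .load()
          .select(col("key").cast("string"),
                  col("value").cast("string"),
                  col("timestamp")))

# For development, print micro-batches to stdout; a production job would
# write to HDFS/Hive or another sink with checkpointing enabled.
query = (events.writeStream
         .format("console")
         .option("truncate", "false")
         .outputMode("append")
         .start())
query.awaitTermination()
```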
Wright Brothers (51050), United States of America, Seattle, Washington.

At Capital One, we're building a leading information-based technology company. Still founder-led by Chairman and Chief Executive Officer Richard Fairbank, Capital One is on a mission to help our customers succeed by bringing ingenuity, simplicity, and humanity to banking. We measure our efforts by the success our customers enjoy and the advocacy they exhibit. We are succeeding because they are succeeding. Guided by our shared values, we thrive in an environment where collaboration and openness are valued. We believe that innovation is powered by perspective and that teamwork and respect for each other lead to superior results. We elevate each other and obsess about doing the right thing. Our associates serve with humility and a deep respect for their responsibility in helping our customers achieve their goals and realize their dreams. Together, we are on a quest to change banking for good.

Senior Data Engineer. Capital One (yes, the "what's in your wallet?" company!) is rethinking the way the world approaches personal investing at https://www.capitaloneinvesting.com. We're experimenting, innovating, and delivering breakthrough experiences (https://youtu.be/ywhzox0ytc0) for 65 million customers. Oh, and we love to write code (https://developer.capitalone.com).

As a Capital One Investing software engineer, you'll work on everything from customer-facing web and mobile applications to highly available, highly scalable microservices to back-end systems with sophisticated data pipelines, all on the cloud! You will drive design, implementation, testing, and release in an agile environment using modern methodologies and open-source tools. Whether a new feature or a bug fix, you will lead your work and deliver the most elegant and scalable solutions, all while learning and growing your skills. Most importantly, you'll work and collaborate with a nimble, autonomous, cross-functional team of makers, breakers, doers, and disruptors who love to solve real problems and meet real customer needs.

The person we're looking for:
- Has a sense of intellectual curiosity and a burning desire to learn
- Is self-driven, actively looks for ways to contribute, and knows how to get things done
- Is deliriously customer-focused
- Values data and truth over ego
- Has a strong sense of engineering craftsmanship and takes pride in the code they write
- Believes that good software development includes good testing, good documentation, and good collaboration
- Has great communication and reasoning skills, including the ability to make a strong case for technology choices

You might notice that there's no mention of specific languages or technologies. That's because your commitment to learning new things is every bit as important to us as what you have already done, maybe even more so, because we don't want to be doing the same thing tomorrow that we're doing today. You accept change and want to grow and evolve into a better member of the team. But just in case, here are some buzzword-worthy tools and technologies we currently use, so the search engines can help you find us: machine learning, data science, Hadoop, MongoDB, ZooKeeper, Cassandra, AWS, Python, Docker, microservices, Go, Java, Scala, Clojure.

Basic qualifications:
- At least 5 years of experience in software development
- At least 3 years of web services software development
- At least 2 years of Python
- At least 1 year of setting up a CI/CD pipeline

At this time, Capital One will not sponsor a new applicant for employment authorization for this position.

Job purpose: In an expert leadership capacity, incumbents in this position are responsible for identifying and solving problems affecting one or more domains within the business. The analyst utilizes strong business acumen and analysis skills to understand and quantify the steps required, impacts, and benefits associated with implementing or changing business process, rules, policy, etc., to impact performance. The analyst is responsible for investigating business situations, identifying and evaluating options for improving business systems, defining requirements, and ensuring the effective use of information in meeting the needs of the business.

Business domain analysis:
- Utilizes expert-level knowledge of specific business domain(s) and industry to provide operating context for analysis and business problem solving
- Serves as subject matter expert in specific business area(s)
- Frames and structures complex cross-departmental problems with minimal supervision, and may guide others in this function
- Independently completes and compiles analysis, and provides leadership and guidance in identifying relationships in data, changes and trending over time, and the creation of histograms, tables, etc., as tools in analysis
- Performs exploratory and descriptive analysis (a short sketch follows these duty sections)
- Routinely executes standard business analysis techniques: SWOT, cost-benefit analysis, forecasts, competitive analysis, gap analysis
- Evaluates actions to improve the operation of a business system
- Applies forecasting models and research methods to differentiate signal from noise
- Utilizes a strong conceptual understanding of predictive analytics and data to partner with data analysts and data scientists
- Supports core metrics and key performance indicators for the business, leading and lagging indicators, and related historical performance
- Understands core data sources and their relevance in business context, limitations, and implemented business rules (i.e., suppression rules)
- Combines data from multiple sources to answer complex questions

Business data modeling, reporting & business intelligence:
- Leads others to solve complex problems; uses sophisticated analytical thought to exercise judgment and identify innovative solutions
- Uses a data modeling tool to build attribute-level data dictionary content
- Works with data analysts to document source-to-target mapping specification documents for the ETL development team
- Acquires and translates complex business data requirements into data model relationships
- Uses their skill set to help the company's management interpret data and test hypotheses
- Manages XML schemas that are consistent with the data models
- Investigates, analyzes, and recommends data formats and structures required to support applications that consume data from the HDFS platform
- Defines reporting data sets, semantic layers, KPIs, and metrics for Tableau reports
- Routinely assembles datasets from core business systems and data repositories (data warehouses and data marts)
- Utilizes BI reporting tools to explore data and produce outputs: Tableau, Business Objects, Power BI

Business data communications:
- Keeps up to date on industry, competitor, and market trends; interprets internal and external business challenges and recommends leading practices to improve products, processes, or services
- Shapes how their own discipline or department integrates with other disciplines, departments, and the broader function
- Communicates difficult concepts and negotiates with others to persuade a different point of view
- Actively solicits and listens to feedback to determine the need for change or improvement
- Directs, coaches, and guides development teams in applying data modeling techniques and the usage of data modeling and repository tools

The above statement of duties is not intended to be all-inclusive, and other duties will be assigned from time to time.
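To make "exploratory and descriptive analysis" and "creation of histograms ... as a tool in analysis" concrete, a minimal pandas sketch; the file name and columns are hypothetical and stand in for whatever core business data the analyst pulls.

```python
# Minimal sketch of descriptive analysis plus a histogram, of the kind the
# duties above describe. File and column names are hypothetical.
import pandas as pd
import matplotlib.pyplot as plt

claims = pd.read_csv("claims_2018.csv", parse_dates=["service_date"])

# Descriptive statistics: count, mean, std, and quartiles per numeric column.
print(claims[["paid_amount", "member_age"]].describe())

# Changes and trending over time: monthly totals expose shifts and seasonality.
monthly = claims.set_index("service_date")["paid_amount"].resample("M").sum()
print(monthly.tail(12))

# Histogram as an analysis tool: shape, skew, and outliers at a glance.
claims["paid_amount"].plot.hist(bins=50)
plt.xlabel("paid amount ($)")
plt.title("Distribution of paid claim amounts")
plt.savefig("paid_amount_hist.png")
```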
Job requirements:
- BS in mathematics, economics, computer science, information management, statistics, or a related field
- 7+ years of data analytics experience in the applicable business domain
- Business domain skills: strong business acumen and analysis skills to understand and quantify the steps required, impacts, and benefits associated with implementing or changing business process, rules, policy, etc., to impact performance; strong understanding of how the business works and how it is executed (people, process, and technology); strong understanding of applicable business concepts, processes, and systems for the business domain; ability to manage multiple priorities in a time-sensitive production environment
- Data skills: ability to write custom processing jobs, ability to leverage parallel processing on large datasets, etc.; ability to work across multiple datasets to assemble data; routinely called upon to extract data for analyst teams for routine or exploratory analytics
- Data knowledge: ability to clearly navigate complex data models to retrieve business information; in-depth knowledge of key data across multiple data domains and/or business units (perceived as a data SME); ability to tutor co-workers (within and outside of the organizational department) on various data domains; routinely validates how data elements are structured and confirms alignment with business definitions; expert knowledge of multiple source systems and the data they generate
- Data modeling: advanced ability to translate business requirements into a logical data model; in-depth knowledge of multiple data models leveraged across the enterprise (either multiple business units or multiple data domains); aligns closely with DBAs in the construction of the physical data model
- Reporting & BI: advanced ability to visualize data in the best representation given the data or analytics required; advanced ability to represent multivariate data within a single visual; thorough understanding of BI applications leveraged within CMFG and within the marketplace; ability to incorporate aesthetic principles into report and scorecard development
- Communications: ability to communicate (written and verbal) up and down the organizational structure regarding all tasks and assignments; ability to communicate ideas at every level of detail, from the most detailed business discussion to a senior leadership forum; strong skills in "demystifying" analytics and communicating benefits as opposed to technicalities; capable of leading development of presentation logic; strong presentation skills

CUNA Mutual Group's insurance, retirement, and investment products provide financial security and protection to credit unions and their members worldwide. As a dynamic and growing company, we strive to create a culture of performance, high standards, and defined values. In return for your skills and contributions, we offer highly competitive compensation and benefit packages, significant professional growth, and the opportunity to win and be rewarded. Please provide your work experience and education or attach a copy of your resume; applications received without this information may be removed from consideration. Job ID: R-002527. Date posted: 03/22/2018.

HERE Technologies, the Open Location Platform company, enables people, enterprises, and cities to harness the power of location by making sense of the world through the lens of location. We empower our customers to achieve better outcomes, from helping a city manage its infrastructure or an enterprise optimize its assets, to delivering drivers to their destination safely. To learn more about HERE, including our new generation of cloud-based location platform services, visit http://360.here.com. As our digital and physical lives become increasingly interconnected, our map of the world is rapidly changing. With every connected device or sensor capable of generating and sharing its context and location, it is data that will connect this complex new world. The question is: how do we make better use of that data and transform it into useful services for people and organizations, all in real time? Our answer is the HERE Open Location Platform. The Open Location Platform big data processing team believes the value of a platform is directly related to the convenience of the platform.
Our goal is to enable developers to easily conflate HERE Reality Index content with their own data using off-the-shelf batch or stream processing technologies. We will achieve this by providing a self-serve Apache Spark and Apache Flink managed cloud service for developers to run their batch or stream location-based data processing jobs.

Main responsibilities: Work closely with product managers and engineers to design, implement, test, and continually improve scalable web applications and services running on AWS. Develop products using agile methods and tools. Develop commercial-grade software that is user friendly and suitable for a global audience. Support production issues, both directly and indirectly with customers. Participate in design reviews and code reviews of your work and the work of your peer engineers. Participate in architecture and design efforts. Work closely with other engineers and testers to deliver high-quality software on time.

Desired skills and experience: B.S. in computer science or equivalent. 2+ years of software development experience building scalable commercial web applications or services. 3+ years of experience with Linux and Linux-based scripting. 2+ years developing software with test automation or test-driven development. Strong service-level design and programming ability; fluency with Java or Scala. Experience with general open-source software, including but not limited to Spring, Hibernate, JAX-RS, JDBC, web containers, PostgreSQL, MySQL, etc. Experience with the Linux shell, Maven, Git (or equivalent) modern version control, and continuous integration / continuous deployment. Understanding of data modeling using relational and non-relational techniques. Working knowledge of big data technology, including NoSQL storage solutions like Cassandra, HBase, MongoDB, Redis, and DynamoDB; complex large-scale processing systems like Hadoop, Spark, Hive, and Pig; or event-stream-based processing platforms like Storm, Spark Streaming, Samza, Apex, and Flink. Exceptional ability to troubleshoot complex distributed issues in a production environment. Highly entrepreneurial, flexible, and hard working; willing to go the extra mile or two to "get things done with high quality."

Preferred qualifications: 2+ years using AWS in production software. Experience building highly available systems and operating a 24x7 service.

HERE is an equal opportunity employer. We evaluate qualified applicants without regard to race, color, age, gender identity, sexual orientation, marital status, parental status, religion, sex, national origin, disability, veteran status, and other legally protected characteristics.

Role: Data Modeler. Location: Watertown, MA. Duration: 6-12 months. Interview: phone and F2F. Need heavy ER/Studio experience.

Important skills. Technical: ER/Studio, Oracle, data modeling experience, SQL, PL/SQL. Soft: able to hit the ground running, quick learner, strong communication.

Must haves: 5+ years of recent and continuous data mapping experience; ER/Studio experience.

Job description. Education: BS in computer science, a related degree, or an equivalent level of relevant experience required. Experience: minimum 5 years of data modeling experience. Proficiency in the following areas: data modeling (relational, dimensional, and hierarchical); generation of ER models with ER/Studio; physical database designs, including sizing, views, triggers, indexing strategies, partitioning, etc. (see the sketch below); Oracle 11g/12c; SQL and PL/SQL; data profiling and data governance standards; standard SDLC processes such as waterfall or agile. Good communication skills: polished, concise, proactive, timely, and appropriate for the setting or forum. Ability to work with moderate supervision; detail oriented, organized, able to prioritize, multitask, meet deadlines, and coordinate schedules with others.

Regards, Kailash Negi | KPG99 Inc., 3240 E State St Ext | Hamilton, NJ 08619. 609-681-2603 || kn@kpgtech.com
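For the physical-design items the data modeler role lists (partitioning, indexing strategies), a minimal Oracle DDL sketch; the table, columns, and partition bounds are hypothetical, and the DDL is shown run over a generic DB-API connection.

```python
# Minimal sketch: Oracle physical design with range partitioning and a
# local index. All object names and partition bounds are hypothetical.
PARTITIONED_TABLE_DDL = """
CREATE TABLE sales_fact (
    sale_id   NUMBER        NOT NULL,
    sale_date DATE          NOT NULL,
    amount    NUMBER(12,2)
)
PARTITION BY RANGE (sale_date) (
    PARTITION p2018q1 VALUES LESS THAN (DATE '2018-04-01'),
    PARTITION p2018q2 VALUES LESS THAN (DATE '2018-07-01'),
    PARTITION pmax    VALUES LESS THAN (MAXVALUE)
)
"""

# A LOCAL index is equipartitioned with the table, so dropping or
# exchanging one partition never invalidates the whole index.
LOCAL_INDEX_DDL = "CREATE INDEX sales_fact_dt_ix ON sales_fact (sale_date) LOCAL"

def apply_physical_design(connection) -> None:
    """Run the DDL over any DB-API connection (e.g., python-oracledb)."""
    cur = connection.cursor()
    cur.execute(PARTITIONED_TABLE_DDL)
    cur.execute(LOCAL_INDEX_DDL)
```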
This position implements and maintains complex infrastructure in a mission-critical environment and acts as the organization's subject matter expert regarding environmental requirements, support systems, cabling, capacity and space planning, systems redundancy and fail-over, disaster recovery, and related policies and procedures.

Essential duties and responsibilities:
- Adherence to Sykes policies on ethics and integrity
- Incident management, level 3: provides support, to include resolution and re-engineering, and facilitates all electrical and environmental requirements
- Server and application implementation: validates requirements; validates system designs; facilitates installation and server builds and prepares change implementation documentation; supports development, test, and training environments; develops technical specifications for procurement of systems, equipment, and services for advanced systems
- Documentation: documents all areas of the data center, including but not limited to cabling, power, client applications, and security
- Change administration: acts as a key participant in change management or request to implement (RTI) and submits change control documentation in accordance with Sykes standards
- IT physical asset management: responsible for IT asset management for the data center and provides reports to management and the Sykes asset manager in accordance with Sykes standards
- Data center responsibilities: tracks, plans, and documents systems in the data center; coordinates with internal departments to plan upgrades in data center facilities; and oversees all activities associated with the data center. Responsibilities include: validates documentation and directly oversees all installations; implements data center standards and best practices; tracks, plans, and documents systems in Sykes-owned and co-located data centers; coordinates with internal departments to plan upgrades in data center facilities and systems; provides input for system moves within regional locations; oversees technical activities associated with all regional data centers; implements data center engineering standards and best practices and makes recommendations for improvements to senior IT management and the data center architect; ensures compliance with legal and administrative regulations governing the operation of data centers; oversees installations and conducts training on data center support systems; conducts inspections to ensure compliance with legal and administrative regulations; researches technical specifications for procurement of systems, equipment, and services; prepares weekly status reports on data center activities within defined responsibilities; ensures backup and disaster avoidance and recovery plans for the data centers are documented and updated regularly; ensures cleanliness and proper layout of all equipment inside the data center as specified in the Sykes standards documentation; creates data center status reports; validates monitoring capabilities; validates systems data; tracks environmental system status; facilitates drills to test staff response capabilities
- May perform other additional duties and responsibilities as assigned

Education and/or experience: Bachelor's degree in computer science or communications engineering required. Five (5) years of data center, network, or systems administration in a large corporate IT environment with mixed operating systems experience required.
Certified Data Center Technician Professional certification required, and any level 4 or higher certification from CNet (Network Infrastructure), IDCA, DCIS, or DCES preferred. Experience in telephony, networking, and Active Directory required, or any equivalent combination of related training, education, and experience.

Qualifications: To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. The requirements listed below are representative of the knowledge, skill, and/or ability required.

Expert understanding of the following:
- Data cabling standards and best practices
- Power and air distribution systems
- Developing power and air conditioning requirements
- Environmental regulations governing data center facilities and operations, such as TIA/EIA-942 and the Uptime Institute
- Physical and systems security issues impacting data center and IT room operations

Working knowledge of the following is highly desirable:
- Raritan Power IQ
- Sunbird dcTrack
- Environmental controls and HVAC
- Avaya and Cisco experience
- Understanding of system administration (preferred)
- Physical and systems security issues impacting data center operations

Advanced proficiency in the use of Visio, the MS Office suite, and AutoCAD. Demonstrated project management experience and the ability to work multiple projects simultaneously. Excellent written and verbal communication skills required.

Physical demands: The physical demands described here are representative of those that must be met by an employee to successfully perform the essential functions of this job. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions. While performing the duties of this job, the employee is regularly required to speak and/or listen. The employee frequently is required to sit. The employee is occasionally required to stand; walk; use hands to finger, handle, or feel; and reach with hands and arms. The employee may occasionally lift and/or move up to 25 pounds. Specific vision abilities required by this job include close vision and the ability to adjust focus.

Commitment to ethics and equal employment opportunity: Sykes Enterprises, Incorporated is firmly committed to conducting business in compliance with the letter and spirit of the law and other accepted standards of business conduct as reflected in the company's policies. Sykes is proud to be an equal employment opportunity employer. Sykes is committed to selecting, developing, and rewarding the best person for the job based on the requirements of the work to be performed and without regard to race, age, color, religion, sex, national origin, ancestry, citizenship, disability, handicap, marital status, veteran status, sexual orientation, pregnancy, genetic information, gender identity and expression, or any other basis protected by federal, state, or local law.

EEO disclaimer: The preceding position description has been designed to indicate the general nature of work performed, the level of knowledge and skills typically required, and the usual working conditions of this position. It is not designed to contain, or be interpreted as, a comprehensive listing of all requirements or responsibilities that may be required of employees in this position.

Req ref no: RNGASS24-25. Location: Alpharetta, GA. Duration: 6 months.

Description: Works on problems of moderate and varied complexity where analysis of data may require adaptation of standardized practices or precedent. Acts independently to identify and select appropriate methodologies.
Works customarily and regularly (50% of the time or more) in ways that require the exercise of discretion and independent judgment. Normally receives general instructions on non-routine work assignments and requires additional instructions and frequent guidance on new assignments. Decisions are made independently. Assigned multiple tasks or projects that are generally reviewed after completion. Contact is typically coordinative in nature and involves exchanging detailed technical information; may deal with both internal and external contacts. Demonstrates working knowledge in the job-related functional area and of the business. Bachelor's degree or equivalent; 3-5 years of experience.

The team member in this role will work on database systems critical to a great customer experience in our self-service channels (web portals and mobile application).

Job title: Jr. Data Scientist / Jr. Database Development & Performance Specialist.

Job duties:
- Analyze architecture, including relationships between database tables and the relationships of the business data stored
- Analyze database design to understand its impact on performance
- Recommend a target database solution based on business and/or technical requirements, such as RDBMS vs. NoSQL database
- Create high-performing database designs and code to retrieve large datasets and optimize system resources
- Tune poorly performing SQL to run at optimized levels; exercise intermediate-to-advanced practices to identify poorly performing code and areas to improve it (a sketch follows this posting)
- Demonstrate the capability to write DDL and SQL of low to medium complexity
- Execute database statements from Java code, including SQL statements, prepared statements, and stored procedures
- Analyze data; automate monitoring of very large datasets to identify data anomalies early, minimizing impact to customer-facing data; and write code to reconcile
- Encrypt data to restrict visibility of sensitive information
- Manage database updates across lower environments
- Testing: create and execute test plans to exercise functional and technical requirements
- Release support: verify that database schema and data patches applied correctly; monitor database activity and log activity during release testing; investigate issues found during a release and assess customer and system impacts

Must-have skills: development experience with Oracle 10g and PostgreSQL; a database development IDE (Toad, Oracle SQL Developer, or other); SQL, DDL, and explain-plan analysis; experience with all phases of the software development life cycle, including system analysis, design, coding, testing, debugging, and documentation; excellent written and verbal communication skills; team player, collaborative; delivers excellent quality of code and documentation; critical thinker and good problem solver; can deliver assigned work independently.

Desired skills: Cassandra design and development experience; Elastic design and development experience; Java; OOM framework.

Education and certifications: a BS in computer science, information systems, or another technical field is a plus but not required. Required shift: 9am ET - 6pm ET.

Viva is an equal opportunity employer. All qualified applicants have an equal opportunity for placement, and all employees have an equal opportunity to develop on the job. This means that Viva will not discriminate against any employee or qualified applicant on the basis of race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or protected veteran status.
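A minimal sketch of the explain-plan workflow named above, using python-oracledb against a hypothetical orders table. The posting itself calls for executing statements from Java; the pattern (bind variables plus DBMS_XPLAN) is the same in either language.

```python
# Minimal sketch: run a parameterized query, then inspect the optimizer
# plan for it. Table, column, and connection details are hypothetical.
import oracledb

conn = oracledb.connect(user="app", password="...", dsn="dbhost/orclpdb1")
cur = conn.cursor()

# A bind variable ("prepared statement") avoids hard-parsing per value.
cur.execute("SELECT order_total FROM orders WHERE customer_id = :cid",
            cid=42)
print(cur.fetchall())

# Explain-plan analysis: populate PLAN_TABLE, then format it.
cur.execute("EXPLAIN PLAN FOR "
            "SELECT order_total FROM orders WHERE customer_id = 42")
cur.execute("SELECT plan_table_output FROM TABLE(DBMS_XPLAN.DISPLAY())")
for (line,) in cur:
    print(line)  # look for full scans of large tables, bad join orders, etc.
```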
Your role: Are you a senior technologist with a passion for architecture to deliver best-in-class IT platforms? Do you have the ability to partner with the business to develop and execute a vision?

We're looking for someone like you who can:
- Partner closely with the business to develop a technology roadmap and strategy to support a rapidly growing primary research data business
- Engage with senior leaders throughout the firm to understand their needs and priorities; act as a senior face to the UBS investment research businesses
- Work alongside other architects to help design and architect the Evidence Lab platform, including providing architecture input into key program decisions
- Define solution architecture
- Liaise with the business architect to proactively drive business-IT alignment
- Perform design reviews for programs of medium and high architectural significance
- Evolve the architecture for the application and functional domain towards the target state
- Collaborate with architects across domains to develop the enterprise architecture
- Develop the architecture in a standard manner and promote reuse
- Co-design the target state for the functional domain through collaboration with relevant business stakeholders
- Define roadmaps that outline the implementation or decommissioning of key applications
- Define architecture standards and principles, and communicate all aspects of architecture to relevant stakeholders

What we offer: Together. That's how we do things. We offer people around the world a supportive, challenging, and diverse working environment. We value your passion and commitment and reward your performance. Keen to achieve the work-life agility that you desire? We're open to discussing how this could work for you (and us).

Take the next step: Are you truly collaborative? Succeeding at UBS means respecting, understanding, and trusting colleagues and clients; challenging others and being challenged in return; being passionate about what you do; and driving yourself forward, always wanting to do things the right way. Does that sound like you? Then you have the right stuff to join us. Apply now.
Disclaimer / policy statements: UBS is an equal opportunity employer. We respect and seek to empower each individual and support the diverse cultures, perspectives, skills, and experiences within our workforce.

Your team: You'll be working within the Investment Bank's research and sales technology department, on a high-profile technology program focused on a rapidly growing capability called Evidence Lab. Evidence Lab is one of the most innovative and highly regarded teams in UBS Investment Bank. The team gathers and analyses primary data from a variety of sources to enable UBS research analysts to make smarter investment recommendations to their clients. They specialize in various analytical techniques, including web harvesting, geospatial, social media, market research, and data science. The technology team will be focused on building a best-in-class data and analytics platform integral to the strategy and growth plans of the business. You can learn more about Evidence Lab here: http://about-neo.ubs.com/content/evidence-lab.

Your experience and skills:
- Outstanding foundation in data technologies, data modelling, and database development (SQL, ETL, data warehousing, RDBMS)
- Knowledge of current leading cloud platforms (Google, Azure, AWS)
- Prior experience with Hadoop, data lakes, data pipelines, and data science platforms
- Passionate about data and analytics; any exposure to AI, NLP, or machine learning is a plus
- Strategic and future thinking: someone who has investigated, evaluated, and helped make decisions on the purchase of new software products, then integrated them and rolled them out to cover the business needs
- Collaborating with other business teams, infrastructure, and CTO teams during evaluations and to introduce new products
- Working with the global support teams to ensure new products are smoothly adopted
- Proven track record in the architecture, design, implementation, and delivery of large-scale distributed systems in a financial services environment (e.g., experience working with lambda architecture)

Key deliverables: solution architecture; application roadmap; architecture principles and standards; design reviews; current state, target state, and roadmaps for the functional domain; enterprise requirements; recommendations for simplification opportunities and application decommissioning; principles, models, and frameworks for application architecture.

About us: Expert advice. Wealth management. Investment banking. Asset management. Retail banking in Switzerland. And all the support functions. That's what we do, and we do it for private and institutional clients as well as corporations around the world. We are about 60,000 employees in all major financial centers, in more than 50 countries. Do you want to be one of us?

Job reference #: 170020BR. Business divisions: Corporate Center. Title: Solution Architect - Evidence Lab. City: Weehawken. Job type: Full time. Country/state: United States - New Jersey. Function category: Information Technology (IT).

At Virgin Orbit, we launch the small satellite revolution, accessing space for missions that were previously unimaginable and fostering new ideas and technologies to create a truly spacefaring civilization. We deliver Virgin's legendary customer service by pairing technological advancements with a clear focus on our customers' needs. We build a workplace and a workforce for the future, knowing that our decisions and investments of today will launch the industries of tomorrow. We engage a diverse and inclusive community of dreamers and achievers in a shared mission to access space to improve life on Earth. At Virgin Orbit, we boost imagination. Search all of our job openings on our career portal.
Position summary: Virgin Orbit (VO) is seeking a new member for our team to serve as Enterprise Data Architect. If you join us in this position, you will lead the design, implementation, and support of VO's business intelligence solutions. These solutions will include an enterprise data warehouse, data migration mechanisms, data archiving mechanisms, reporting framework(s), and actionable dashboards. The Enterprise Data Architect will be supported by, and work closely with, the MES, ERP, and PLM systems leads, as well as management, supply chain, manufacturing, engineering, shop floor, and quality subject matter experts and IT administrators. This is a full-time position at Virgin Orbit's aerospace facility in Long Beach, California.

Responsibilities:
- Design and implement VO's own data warehouse to store and report on the rocket's digital data trail, all the way from as-designed to as-flown
- Recommend, design, build, and maintain data integrations to and from the data warehouse
- Recommend and implement a data archiving solution
- Design, build, and maintain data reports and dashboards
- Develop ad-hoc reports for management and key business-user metrics and KPIs (see the sketch after the requirements lists below)
- Train and coach VO's systems and IT teams, as well as key business users, on report- and dashboard-building techniques
- Help Virgin Orbit do our part to make the world a better place
- Constantly learn new things, and share them with your team members
- Perform other duties as assigned

If you want to join us, you'd better have all of these…
- Degree in computer science or business fields
- Minimum of 5 years of experience in database and data warehouse design
- Expert knowledge of databases, data warehouses, data integration, replication, and mirroring
- Expert knowledge of SQL (PL-SQL or T-SQL, preferably both)
- Problem solver, able to work in multi-disciplinary teams
- Excellent interpersonal, coaching, and leadership skills
- Must be able to work all shifts and be available for overtime, as well as weekends, when needed
- Must be able to sit and/or stand for extended periods (8 hours minimum)
- Must be present on site 100% of the time
- Ability and willingness to thrive in a fast-paced, rapidly changing work environment
- Driven, highly motivated, and committed to improvement
- Ability to add something unique and positive to our team
- Generally high level of all-around awesomeness
- Physically able to handle items weighing up to 10 lbs (unassisted)

… and you should probably have a bunch of these too.
- Experience in MS SQL Server or Oracle DB, or both, is preferred
- Experience in Elasticsearch, the RESTful search and analytics engine, is a big plus
- Knowledge of Dremio data connectivity and analytics is a big plus
- Knowledge of Tableau or other similar reporting frameworks is a big plus
- Experience in the aerospace, aviation, automotive, or heavy equipment industries is a plus
- Passion for space exploration and a firm belief in the utility of affordable access to space
- Ability to thrive as a 'one-person team'
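For the ad-hoc KPI reporting responsibility above, a minimal sketch of pulling a weekly metric from a warehouse table into a shareable file; the connection URL, table, and columns are hypothetical.

```python
# Minimal sketch: an ad-hoc KPI pull from a warehouse table to CSV.
# The connection URL, table, and column names are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("mssql+pyodbc://report_user:...@warehouse_dsn")

KPI_SQL = """
SELECT build_week,
       COUNT(*)              AS parts_received,
       AVG(inspection_hours) AS avg_inspection_hours
FROM   receiving_log
GROUP  BY build_week
ORDER  BY build_week
"""

weekly = pd.read_sql(KPI_SQL, engine)
weekly.to_csv("weekly_receiving_kpis.csv", index=False)
print(weekly.tail())
```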
ITAR requirements: To conform to U.S. government space technology export regulations, the applicant must be a U.S. citizen, a lawful permanent resident of the U.S., a protected individual as defined by ITAR (22 CFR §120.15), or eligible to obtain the required authorizations from the U.S. Department of State.

Personal requirements: Our idea of a happy life: build rockets, work with brilliant people, serve purposeful customers, and open access to space. Sir Richard Branson's belief is, "If you get the perfect mix of people working for your company, you have a far greater chance of success." From the very start, we have placed top priority on growing and fostering a team of creative, diverse, and talented women and men. In the words of Sir Richard Branson, "We believe it's better to have a hole in your team than an asshole in your team!" Search all of our job openings: www.virginorbit.com. Virgin Orbit is an equal opportunity employer; employment with Virgin Orbit is governed on the basis of merit, competence, and qualifications and will not be influenced in any manner by race, color, religion, gender, national origin, ethnicity, veteran status, disability status, age, sexual orientation, marital status, mental or physical disability, or any other legally protected status.

Title: Data Architect. Location: Miami, FL. Duration: permanent hire. Compensation: co. Work requirements: US citizens, GC holders, or those authorized to work in the US.

Overview: TekPartners has some of the most sought-after information technology positions available. As a reputable company in the IT staffing industry, you can trust us to place you in the right position. We currently have an opportunity for a Data Architect in Miami, FL.

Summary of position: The Data Architect is responsible for executing the enterprise data strategy, enforcing data governance, designing data models and the reporting framework, and enabling tools. They are also accountable for the standards and ongoing health of the data and information architecture.

Principal duties and responsibilities: Develop the data/information architecture solution, data components, and high-level baseline and targets. Document the as-is data architecture and ensure the to-be architecture meets the needs of all the functional business areas. Determine the effectiveness of the data architecture design; create necessary implementation and migration plans and recommend new solutions as required. Participate in the data governance framework, including processes for governing the identification, collection, and use of data to assure accuracy and validity. Work with IT and the business to ensure consistency in data definitions and data usage across systems and tools. Act as the point of contact for the definition of data requirements, structure, and taxonomies. Support in-solution development by leveraging infrastructure architecture standards, practices, assets, and frameworks. Manage and direct the activities of up to two direct reports.

Education and experience requirements: 5+ years of experience in data architecture design. Bachelor's degree in computer science or a related field required; advanced degree strongly preferred.

Additional requirements: Experience working as a leader and collaborator in a team-oriented environment is essential. Can conform to shifting priorities, demands, and timelines through analytical and problem-solving capabilities. Reacts to project adjustments and alterations promptly and efficiently; flexible during times of change. Ability to read the communication styles of team members and contractors who come from a broad spectrum of disciplines. Persuasive, encouraging, and motivating; ability to elicit cooperation from a wide variety of sources, including upper management, clients, and other departments. Ability to defuse tension among the project team should it arise. Ability to bring a project to successful completion through organizational dynamics. Strong written and oral communication skills; strong interpersonal and operational skill sets. Adept at conducting research into project-related issues and products, with strong analytics skills.
Must be able to learn, understand, and apply new technologies. Strong customer service skills and focus required. Ability to effectively prioritize and execute tasks in a high-pressure environment is crucial. Tenacious, driven, energetic, and a high degree of professional integrity. Knowledge of data models, data governance frameworks, and data reporting tools. Deep understanding of the data and information architecture discipline, processes, concepts, and best practices. Experience with data and application design. Strong technical understanding of data integration standards, frameworks, and best practices. Demonstrated ability to balance architectural theory with practical solutions. Robust skills in developing an understanding of business needs and requirements. Follow directions from a supervisor. Interact well with co-workers, supervisors, and management. Understand and follow posted work rules and procedures. Accept constructive criticism. Maintain a professional appearance and demeanor at all times. The position requires reliable, consistent, and punctual attendance.

Our benefits package includes: comprehensive medical benefits, competitive pay, a 401k retirement plan, and much more.

About TekPartners: TekPartners is one of the fastest-growing private staffing firms in the United States. We are a premier provider of highly qualified IT talent, workforce solutions, and business intelligence solutions to many enterprise organizations across the nation. As experts in the industry, our team continues to match proven talent to the right job opportunity every day. TekPartners is an equal opportunity employer.

Providing an efficient framework for real-time analytics on ingested or streaming data is central to our mission. Your role will be to define such a framework, leveraging workload-specific hardware accelerators, along with the design and implementation of significant portions of the associated software.

Skills, education, and experience required: MS in computer science or an equivalent degree. 10+ years' experience in software infrastructure. Strong algorithmic and data-structure background. Ability to write correct C or C++ code quickly. Experience with distributed systems. Desire to push the state of the art. Self-motivated, independent, and proactive.

Additional success factors: excellent communication and organizational skills to collaborate and stay in sync with other team members; experience with the design and implementation of large projects; startup experience.

Keywords: NoSQL, SQL, columnar database, OLAP, OLTP, HDFS, DDL, DML, LLVM, NLP, Spark, Hadoop, big data analytics, graph analytics.

Your future evolves here. Evolent Health has a bold mission: to change the health of the nation by changing the way health care is delivered. Our pursuit of this mission is the driving force that brings us to work each day. We believe in embracing new ideas, challenging ourselves, and failing forward. We respect and celebrate individual talents and team wins. We have fun while working hard, and Evolenteers often make a difference, in everything from scrubs to jeans. Are we growing? Absolutely: 56.7% year-over-year revenue growth in 2016. Are we recognized? Definitely.
We have been named one of Becker's "150 Great Places to Work in Healthcare" in 2016 and 2017, and one of the "50 Great Places to Work" in 2017 by Washingtonian, and our CEO was number one on Glassdoor's 2015 Highest-Rated CEOs for Small and Medium Companies. If you're looking for a place where your work can be personally and professionally rewarding, don't just join a company with a mission. Join a mission with a company behind it.

Data Engineer - Payor Data Services. We are looking for bright and energetic individuals to be data engineers in our Payor Data Services department. This position involves the programming and analysis of healthcare data, with an emphasis on payer data, coding, and data analytics.

Position description:
- Utilize SQL programs to build metadata for various data feeds (see the sketch after the qualifications list)
- Develop SAS programs (once trained) to integrate and analyze payer data from multiple sources
- Load and synthesize healthcare data from multiple sources
- Review data requirements, then design and implement logic to achieve data needs
- Implement and develop data quality control protocols and monitor their impact
- Assist in designing, programming, and standardizing processes and reports

Qualifications:
- Competency in the use of the SQL language and scripting to load SQL Server environments
- Familiarity with SAS and a desire to expand SAS programming knowledge is a plus
- Competency in data manipulation and analysis: accessing raw data in varied formats with different methods, and analyzing and processing data
- Must be analytical, detail oriented, and possess a desire to advance and grow personally and professionally
- Excellent PC and database skills
- Ability to multi-task and manage multiple projects with varying timelines
- Must have a passion for data and healthcare
- BS/BA or master's in computer science, data analytics, informatics, or a comparable program with a quantitative emphasis
- 1+ years of SAS and SQL programming experience, including SAS macros, PROC SQL, and/or Enterprise Guide
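One plausible reading of "SQL programs to build metadata for various data feeds" is profiling each feed's staging table into a metadata catalog; a minimal SQL Server sketch under that assumption. The schema, table, and connection string are hypothetical, and pyodbc is one common driver.

```python
# Minimal sketch: profile a feed's staging table into column-level metadata
# using SQL Server's INFORMATION_SCHEMA views. Names are hypothetical.
import pyodbc

PROFILE_SQL = """
SELECT c.COLUMN_NAME,
       c.DATA_TYPE,
       c.IS_NULLABLE,
       c.CHARACTER_MAXIMUM_LENGTH
FROM   INFORMATION_SCHEMA.COLUMNS AS c
WHERE  c.TABLE_SCHEMA = ? AND c.TABLE_NAME = ?
ORDER  BY c.ORDINAL_POSITION
"""

def profile_feed(conn_str: str, schema: str, table: str) -> None:
    """Print column-level metadata for one inbound feed's staging table."""
    with pyodbc.connect(conn_str) as conn:
        cur = conn.cursor()
        for row in cur.execute(PROFILE_SQL, schema, table):
            print(row.COLUMN_NAME, row.DATA_TYPE, row.IS_NULLABLE,
                  row.CHARACTER_MAXIMUM_LENGTH)

# Hypothetical usage:
# profile_feed("DSN=edw;UID=etl;PWD=...", "stg", "claims_feed")
```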
Evolent Health is an equal opportunity / affirmative action employer.

We are a specialty retailer offering the very best of what's next in fashion for men, women, and children since 1901. Join us where it all began. Whether you design clothes or business strategies, crunch numbers, lead projects, or write code, we have a place for you at our Seattle headquarters. And we think Seattle is a pretty great place to live. More than just rainy days and coffee, Seattle has it all: mountains and beaches, arts and parks, music and film. It's made up of quirky neighborhoods, award-winning restaurants, and thriving industry. Come see for yourself!

Be part of a dynamic team of experienced data warehouse engineers responsible for some of our most valued information platforms for BI, data analytics, and executive dashboards. We continually enhance and add value to our massively parallel processing database engines. Redshift and Teradata on AWS are our primary MPP platforms; however, our team is a big part of integrating these with Hadoop and BI cubing solutions.

A day in the life:
- Build automation tools around provisioning, monitoring, and BC/DR for multi-region AWS MPP database solutions
- Develop tools to improve workload management, data replication, resource management, and user access
- Partner with development teams to review and tune applications to be as efficient as possible
- Provide oversight and quality control of SQL being promoted to production; be proactive and involved early in the development cycle
- Routinely monitor the performance and health of Teradata on-premise and AWS-based platforms, in addition to multi-region Redshift clusters
- Provide coordination of activities in database software and hardware maintenance events
- Troubleshoot and tune data loading, refreshing, and replication issues involving Data Mover, FastLoad, MultiLoad, TPT, Redshift COPY/UNLOAD, etc. (a sketch follows this posting)
- Support, troubleshoot, and enhance backup methods using ARC and other methods
- Assist leadership in providing technical guidance and roadmaps on Teradata and other related data management platforms
- Plan and support Teradata hardware and software upgrades and maintenance
- Work with the compliance team on the design, development, and implementation of security policies and integrity controls, including SOX, PII, and PCI audit requirements
- Understand and communicate principles and design tradeoffs for a mix of Teradata and Redshift databases
- Escalate and follow through on incidents created with the database vendors
- Develop and maintain database standards and documentation

You own this if you have…
- Bachelor's degree in computer science or a related field, or equivalent training and experience
- Teradata database administration: 5 years overall, 3 years on version 14.x or higher
- Amazon Redshift: 2 years of development, administration, or performance tuning
- Amazon Web Services: 2 years of infrastructure development, including EC2, ELB, Route 53, ASG
- At least 4 years with Teradata tools and utilities: SQL Assistant, BTEQ, MultiLoad, Data Mover, etc.
- At least 4 years with Teradata performance optimization using PDCR, DBQL, etc.
- At least 4 years with Teradata SQL tuning
- At least 2 years with Teradata 14.10 and SLES 11 new features; Teradata certification a plus
- Experience with data warehousing concepts, including star and snowflake schemas, conformed dimensions, etc.
- Experience managing active enterprise data warehouses and appliances
- Experience with Teradata Active System Management (TASM) for workload management
- Troubleshooting and tuning of complex database performance and replication issues
- Unix administration experience managing Teradata systems (SUSE Linux 10)
- Solid verbal and written communication skills, to work well with a diverse community of customers, including novice users, developers, data scientists, and executive leadership
- Experience with CPPT for capacity planning
- Physical database design and implementation experience
- Experience optimizing database objects to improve the experience of using BI tools such as MicroStrategy, Tableau, SAS, etc., a plus
- Willingness to work a flexible schedule and be on call to accommodate project deadlines and business requirements

We've got you covered. We offer a comprehensive benefits package that includes medical, vision, and dental coverage, a fabulous merchandise discount, an employer-matched 401(k) plan, an employee stock purchase plan, and much more, depending on your role. We are an equal opportunity employer committed to providing a diverse environment. This job description is intended to describe the general nature of the work employees can expect within this particular job classification; it is certainly not a comprehensive inventory of all duties, responsibilities, and qualifications required for this job.

Job: Technology. Title: Data Warehouse Database Engineer. Location: Washington - Seattle. Requisition ID: 331065. Other locations: United States - Washington - Seattle.
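For the Redshift side of the load and refresh work named above, a minimal COPY/UNLOAD sketch; the S3 paths, IAM role, and table names are hypothetical.

```python
# Minimal sketch: Redshift bulk load and export. S3 paths, the IAM role,
# and table names are hypothetical.
import psycopg2  # Redshift accepts PostgreSQL-protocol clients

COPY_SQL = """
COPY analytics.fact_orders
FROM 's3://example-bucket/orders/2018/03/'
IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-load'
CSV GZIP
COMPUPDATE OFF STATUPDATE OFF
"""
# COMPUPDATE/STATUPDATE OFF keep repeat loads fast on a table whose
# encodings and statistics are already established.

UNLOAD_SQL = """
UNLOAD ('SELECT * FROM analytics.fact_orders WHERE order_date >= \\'2018-03-01\\'')
TO 's3://example-bucket/exports/orders_2018_03_'
IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-load'
GZIP PARALLEL ON
"""

def run(dsn: str) -> None:
    """Execute the load, then the export, over one connection."""
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute(COPY_SQL)
        cur.execute(UNLOAD_SQL)
```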
About Neuberger Berman technology: Collectively, we are a unique group of technologists with broad capabilities, operating as a flat organization and working in a brand-new, state-of-the-art office located near Central Park. We are independent and employee-owned; our decisions are not driven by outside shareholders or short-term market fluctuations. Technology is co-located with the business in our global headquarters. Technology doesn't just enable our business; it is taking accountability to create tools that give our business a competitive edge. Your work space is designed for collaboration, with huddle and scrum rooms, audio-visual connections across the floor, soft seating, and dedicated white-noise development spaces. Employee feedback is taken seriously and contributes to our constantly evolving office environment.

Summary: We are looking for a hands-on data developer/engineer with strong MS SQL Server experience. The candidate requires knowledge of RDBMS modeling and development experience, as well as some data science technologies.

Responsibilities:
- Work closely with business stakeholders to understand their analytics and construct efficient, scalable algorithms to implement them
- Set up strong foundational procedures, guidelines, and standards for data analytics and processing
- Integrate new software tools for data analysis into the existing toolset
- Build automated pipelines for developing, testing, and deploying data analytics applications
- Conduct ad-hoc analysis and present results in a clear manner
- Process, clean, and verify the integrity of data used for analysis

Requirements:
- Bachelor's degree or equivalent in computer science, data science, or engineering
- 5+ years of experience with MS SQL Server, with knowledge of OLTP and dimensional modeling design and development; familiar with MS SQL Server performance optimization techniques (see the sketch after this list)
- Experience with T-SQL and the various advanced programming features of MS SQL 2012+
- Experience with data integration and workflow tools (e.g., Informatica PowerCenter, SQL Server Integration Services); hands-on with large-data pre-processing (ETL) and data cleansing
- Experience with SQL Server Data Tools and familiarity with various testing approaches
- Experience with DevOps toolsets such as Confluence, Jira, TFS, and Git
- Experience working within an agile software development framework, with a strongly disciplined approach to software development
- Some experience with the financial investment domain, plus C# and Python; knowledge of investment risk is a plus
- Nice to have: experience with reporting and analytical tools (e.g., Business Objects, SSRS, Tableau)
- Nice to have: experience with Azure or AWS is a plus
- A team player who is eager to learn, with strong analytical and communication skills
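Given the dimensional-modeling emphasis here, a minimal T-SQL sketch of a type 1 (overwrite) dimension load via MERGE, run from Python; all table and column names are hypothetical, and pyodbc is one common SQL Server driver.

```python
# Minimal sketch: upsert a dimension table from a staging table with
# T-SQL MERGE, a common dimensional-modeling load pattern. All names
# are hypothetical.
import pyodbc

MERGE_SQL = """
MERGE dbo.dim_customer AS tgt
USING stg.customer AS src
    ON tgt.customer_id = src.customer_id
WHEN MATCHED THEN
    UPDATE SET tgt.customer_name = src.customer_name,
               tgt.segment       = src.segment
WHEN NOT MATCHED BY TARGET THEN
    INSERT (customer_id, customer_name, segment)
    VALUES (src.customer_id, src.customer_name, src.segment);
"""

def load_dim_customer(conn_str: str) -> None:
    """Apply staged customer rows to the dimension in one statement."""
    with pyodbc.connect(conn_str) as conn:
        conn.cursor().execute(MERGE_SQL)
        conn.commit()
```

A type 2 (history-preserving) dimension would instead expire the matched row and insert a new version; MERGE supports that variant too.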
Neuberger Berman is an equal opportunity / affirmative action employer. The firm and its affiliates do not discriminate in employment because of race, color, religion, gender, national origin, protected veteran status, disability, age, citizenship, marital or domestic/civil partnership status, sexual orientation, gender identity or expression, or because of any other criteria prohibited under controlling federal, state, or local law. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact onlineaccommodations@nb.com.

Founded in 1939, Neuberger Berman is a private, 100% independent, employee-owned investment manager. From offices in 31 cities worldwide, the firm manages a range of strategies, including equity, fixed income, quantitative and multi-asset class, private equity, and hedge funds, on behalf of institutions, advisors, and individual investors globally. With more than 500 investment professionals and more than 1,900 employees in total, Neuberger Berman has built a diverse team of individuals united in their commitment to delivering compelling investment results for our clients over the long term. Our culture has afforded us enviable retention rates among our senior investment staff; it has earned us a citation as the second-ranked firm (among those with 1,000 or more employees) in the Pensions & Investments 2017 Best Places to Work in Money Management survey, after we had finished in the top three from 2013-16. http://www.nb.com

This is a very cool company: become a Spark analytics expert, work with the best and brightest, make a lot of money, get equity that is expected to be very valuable in a short amount of time, and work on some of the best federal analytics projects. Some details are listed below. Our client is a product and services firm, pre-IPO, that specializes in data science and analytics for the federal government and commercial market. They are seeking several data engineers / federal solutions architects to join their field engineering team. The work is typically a 50/50 split between:
- Development: building prototypes and proofs-of-concept and doing ETL on customer data using technologies including Databricks (the Apache Spark platform), Hadoop, AWS, Azure, and pretty much anything else you can think of that could be used in cloud-based analytics
- Working with federal customers to understand their requirements, present your prototypes and proofs-of-concept, and help them figure out the best way of building an analytic solution that meets their needs

The key skills you need coming in the door are development (in Java, Python, or Scala, whichever you prefer) as well as SQL; ETL experience is a huge plus. Our client will provide extensive training and hands-on mentoring in analytic technologies, particularly Spark (if you already have those skills, that is even better). Salary will be great, plus equity (which can lead to substantial additional compensation), a full benefits plan, etc. Work locations will be at one or more customer sites close to you (if you live near Ft. Meade, you can support that customer; if you live in northern VA, you can support customers close to your house). In many cases, part of the work can be done remotely using the company's own clusters. US citizenship is required. Security clearances up to and including FS poly can be kept active; they have lots of customers throughout civilian, defense, and intelligence agencies. 11053.

Stanley Reid & Company is a technical and executive search agency that works with the most compelling firms in the US intelligence community and Department of Defense. If this opportunity appeals to you and you meet the requirements, please send us your resume. If this is not the right fit for you, please visit http://careers.stanleyreid.com to see a full list of our current opportunities in software engineering, data science, cyber security, and cloud infrastructure.

Candidate referral program: We will pay a $5,000 referral fee if you introduce us to a friend or colleague whom we don't already know and whom we are able to place at any of our clients within 1 year of the referral (the referral fee is paid when the person reaches their 6-month employment anniversary). Top Secret/SCI.

If you are an experienced software engineer with a passion for designing and delivering big data solutions using cutting-edge technologies, want to be a part of an exciting data democratization journey,
are looking for a collaborative team environment where you will have a wealth of opportunities to innovate and learn and possess intellectual curiosity then a career in customer data technologies in fidelity’s personal investment (pi) department may be right for you!at fidelity we love to use data and analytics to personalize incredible customer experiences and develop solutions that help our customers live the lives they want as part of our digital transformation we are making significant investments in innovative big data capabilities to support our customer obsessed culture and growing data science practice we are looking for a hands-on data technologist that can help us build our next generation cloud enabled data ecosystem the expertise we’re looking for* bachelor’s degree or higher in a technology related field (e g engineering computer science etc ) required * understanding principles best practices and trade-offs of schema design for both relational and nosql database systems * extensive experience in object oriented programming (java scala python) including the ability to code in more than one programming language our engineers work across many of them sometimes simultaneously* extensive experience in building distributed back-end systems * hands-on experience in implementing batch and real-time big data integration frameworks * solid understanding of data access and manipulation patterns hands-on experience using big data processing technologies such as spark * experience developing big data applications in a private or public cloud preferably aws * understanding of linux fundamentals and ability to interface with the os using system tools scripting languages integration frameworks etc * experience with devops continuous integration and continuous delivery (maven jenkins stash ansible docker) * experience executing projects in agile environments (kanban and scrum) the purpose of your roleour organization is seeking a big data engineer to be part of an exciting and fast-paced engineering team focused on designing and implementing large-scale distributed data processing systems using cutting edge cloud based open source and proprietary big data technologies in this role you will implement a variety of solutions to ingest data into process data within and expose data from a data lake that enables our data analysts and scientists to explore data in the ad-hoc manner as well as quickly implement data-driven models that generate accurate insights in an automated fashion this position is a critical element to delivering fidelity’s promise of creating the best customer experiences in financial services our teams are flat non-hierarchical structures which run on agile principles each team member has a technical leadership opportunity in the field of their competency we take pride in supporting our production systems and every customer issue is significant for us we make decisions on whether to maintain the legacy code or rewrite it but we never whine about it we conduct team-wide brainstorming and design reviews in which every voice counts and every opinion is challenged in our environment ideas compete but people collaborate every team member is proud of the solution that the whole team delivers and feels responsible for the team’s goals the manager provides frequent feedback to team members and so do they for the manager the skills you bring* you paid attention in those algorithms and data structures classes and mastered your learning on the job we build systems that operate at 
thousands of transactions second on an array of platforms if you cannot discern between linear vs exponential time complexity our customers will suffer * you love building things visualizing the design and figuring out the most economical ways of implementing those in code * you actually enjoy working in a highly collaborative environment you enjoy working on a good idea no matter whether it’s yours or someone else’s you passionately share your ideas and love it when other team members criticize and get involved with them but you can give those up if the team de-prioritizes them we work within close confines of our designers product owners operations in a collaborative open space environment cube-dwellers might find it hard to sustain-don’t be one * you have a passion and intellectual curiosity to learn new technologies and business areas you enjoy researching technologies and figuring out which ones work best for us you love prototyping and experimenting you find it ok when your project occasionally fails and you learn from that failure * you are confident in your ability to deal with ambiguity and work in fast-paced environment you prefer speed to perfection as long as imperfections don’t affect the customer * you have excellent communication skills both through written and verbal channels* you have the ability to understand and adapt to changing business priorities and technology advancementsthe value you deliver* suggesting and improving system designs by proposing your own ideas openly expressing disagreements with ideas that look wrong but accepting team’s judgement once it’s been made and executing on team’s decisions * helping set up priorities for the team * making decisions which are executed within the team but benefit the company as a whole in both short and long term * building and supporting robust mission critical applications that provide the best customer experience * promoting engineering simplicity and automation * exploring and leveraging new technology trends * mentoring team members and bringing them up to speed on latest big data technologies; promoting continuous learning * collaborating with internal and external teams to deliver technology solutions for the business needs* resolving technical roadblocks to the team and mitigating potential riskshow your work impacts the organizationcustomer data technology in pi supports the platforms that enable business users to collect and analyze the customer data needed to provide the best customer experience company overviewat fidelity we are focused on making our financial expertise broadly accessible and effective in helping people live the lives they want we are a privately held company that places a high degree of value in creating and nurturing a work environment that attracts the best talent and reflects our commitment to our associates for information about working at fidelity visit fidelitycareers com fidelity investments is an equal opportunity employer **job:** **database***title:** *big data engineer***location:** *ma-boston***requisition id:** *1709233* data is at the core of outreach's strategy it drives our customers and ourselves to the highest levels of success we use it for everything from customer health scores and revenue dashboards to operational metrics of our aws hosts and machine learning with the rapid customer and data growth outreach has experienced over the past 18 months we are stretching the limits of what our current platform can do and are building out our next generation data warehouse as a 
member of the infrastructure engineering team you'll be responsible for writing the software that manages our data warehouse and moves data into it you'll be extending the software of our etl system connecting to apis and managing warehouse schema using code you need to be excited to write sql and even more excited to write code that generates sql your daily adventures will include partner with other outreach teams to build a deep understanding of their data platform needs and use cases; use this input and your own customer obsession to guide your work design build and expand the data tables that support the self-service reporting needs of the outreach teams following table design best practices evolve our existing batch etl framework to incorporate new data flows and loading paradigms design and implement new streaming data transport frameworks to meet our real-time analytics needs design robust etl data flows that efficiently move data into the data warehouse and achieve our data availability and quality standards use sql and programming languages (such as ruby) to build data connectors to multiple input sources (including mysql salesforce zuora zendesk elasticsearch influxdb rest apis and more) establish standards procedures and dictionaries around the outreach data platform help our teams maximize the value of our data with hands-on expert support and brilliance our vision of youthe ideal candidate will have: true passion for analytics and be excited about modeling the tables and building data flows needed to provide deep insight preference to solve problems in a scalable manner with code rather than sql expert-level skills using sql to write ad hoc queries against complex database schema 3+ years professional software engineering experience building backend tools in a linux unix environment preferably using ruby experience designing and building tables and etl data flows for an enterprise data warehouse (e g redshift) familiarity with at least one business intelligence (bi) suite (e g domo tableau etc ) and how it interfaces with a data warehouse experience with data streaming (e g kafka kinesis) and big data processing (e g hadoop emr spark) undergraduate degree in an engineering computer science or related field about outreachoutreach is a communication platform built from the ground up to help people communicate more effectively today we focus on the sales organization helping them engage with their prospects and customers through email calling and linkedin communication flows to put it into perspective our customers spend the majority of their day living inside our software relying on it to be more effective communicators with their audience since our first days here at outreach learning about how people communicate we became obsessed with understanding our customers' problems as a result we've built the most loved product on the market and won the hearts and minds (and business) of some incredible organizations around the world in addition to our relentless focus on the customer we've received over $30 million in venture funding been listed on seattle business magazine's 100 best places to work been ranked #1 for 2016 sales automation acceleration software by ambition and were named to the 2016 forbes cloud 100-rising star list in the sales technology category we aren't slowing down anytime soon who we areoutreach is headquartered in seattle with sales offices throughout the us our team shares a mission and sense of purpose: to help sales teams succeed by better connecting with 
their customers we're a team of problem solvers and overachievers who value diversity of experience and perspective we like to be challenged we seek out people who are passionate about their craft and relentless in their pursuit of excellence we value grit building a company isn't easy so the ability to dig deep and tap into your inner super powers from time to time is important we've learned that a good attitude and great teamwork go a long way to overcoming the inevitable bumps along the way outreach is a rocket ship and we're excited to be building our future together if you join us you'll be challenged with big projects empowered to own them and enabled to crush it let's do this thing together why you'll love it herestarting on day #1 you'll get the opportunity to make a significant impact on our customers and our company being on a rocket ship means there is plenty of opportunity to grow your career every day we live in a company culture that values our people and delivers results we work as a team to achieve big things and celebrate those accomplishments quality coffee and tea drinks from our downstairs neighbor miir; great food and snacks in office we have an open floor plan with comfortable furniture (from couches to bean bags) to encourage opportunities to collaborate and adopt your own working style we offer competitive market salaries and benefits including 100% covered health insurance for employees 401k industry-changing parental leave policy and unlimited time off so much more bracketscience technology service bracket innovates at the leading-edge of clinical research data one exceptional service at a time from the advanced technology of our ecoa electronic clinical outcomes assessments flexible platform to the efficiency of our scalable and configurable randomization and trial supply management (rtsm) clinical irt solution to our science-focused rater training and quality assurance programs bracket does it faster better and with an eye on the future of our industry to achieve this we maintain an unwavering commitment to employing only the brightest most talented colleagues gleaned from a wide variety of professional fields if you are a creative problem-solver and have a hard-wired instinctual commitment to exceptional customer service we'd love to talk to you!position overview:the team lead clinical data management sql programming is responsible for working with the clinical data management sql programming staff to oversee the implementation support of bracket’s products and services additionally the position is responsible for process improvement; working with other data management staff and internal project team members to ensure on time and accurate deliverables; and working with customers to understand reporting requirements working under the direction of management this position mentors and manages clinical data management sql programmers in all aspects of the clinical data management sql processes and procedures essential duties and responsibilities: note: other duties may be assigned communicates effectively within a multi-disciplinary project team to complete assigned tasks on time and within budgetperforms all work in accordance with documented standard operating procedures (sops) working instructions and best practicesassists in developing and improving standard operating procedures (sops) working instructions and best practicesassist with other departmental initiatives and projects as requiredenhance the bracket business model by institutionalizing business 
processes implementing best practices and templates and seeking ways to work more efficientlyact as technical consultant to other departmentsresponsible for supervising a team of clinical data management sql programmers to ensure the integrity of data within sql deliverablesoversee the timely execution of client requestsresponsible for scheduling team work and manage a calendar of client deliverablesresponsible for estimating planning tracking documenting and reviewing the work performed by team members using departmental guidelines set communicate individual and team performance expectations and monitor measure the team’s and individuals’ work to ensure that project commitments and targets are met ensure assigned employees are trained and mentored as new processes and or technology are planned and introduced generate work instructions stepping through technical guidelines to be followed by clinical data management sql programmersserve as an escalation point for technical issues that cannot be resolved directly by employees engage staff and motivate the team to produce results actively coach and mentor team members provide constructive feedback and suggestions for improvement coordinate employee development plans and annual performance appraisals; perform salary bonus and promotion recommendations other project work and responsibilities as requiredconfiguration and developmentassume responsibility for the design development and on-time delivery of project specific configurations and customizationswork directly with the external and internal clients to prepare adapt or agree on all specificationscreate and document system design specifications and other technical documents as requiredunderstand and follow all coding standardscreate robust well documented codecreate database objects as requiredcomplete unit testing and peer review documentation as requiredsupport test script development and performance of user acceptance testing support all phases of testing by efficiently diagnosing and resolving defectsintegrate implemented code and database objects into release application participate in all post live study changes including risk assessment specifications testing and interactions with study team for shared activitiesrecommend design and implement on-going application and architectural improvementsapplication supportdiagnose and resolve defectsidentify areas where applications are impacting the underlying data work internal team to define test cases required to determine impact to datacomplete all required change control documentation including updates to requirements design and other technical documents as requireddeploy resolution to testing and production environment as requiredconsult with internal team on defects that impact core producteducationba bs degree or equivalent work experienceexperience and competencies5+ years of professional experience in the programming field experience with some or all of the following technologies:microsoft sql server2008 2012 mysql or other relational databases including stored procedures views and triggersmicrosoft sql server reporting services or other reporting toolteam foundation server visual source safe subversion or other source control productexperience developing enhancing and customizing configurable applications is desirabledomain experience in any of the following is desirable:mobile device applications include smartphone and tabletelectronic data capture applicationsclinical trial management systemsexperience estimating development and 
support tasksproficient in access excel and other office software technologies and applications experience with standard data mining and data presentation techniques familiarity with 21 cfr part 11 or experience in a regulated environment desirablepharmaceutical biotech industry experience highly desirablestrong verbal and written communication skills ability to complete high quality technical documentationdemonstrate extreme attention to detail and organization in all aspects of workability to quickly learn and apply new skills procedures and approachesdemonstrated ability to meet very short deadlines & multi-task in an extremely fast-paced work environment with little direct supervision proven ability to work both independently and in a team-oriented environment providing back-up support to team members & establishing maintaining effective work relationships with co-workers within and across functional areaspreviously demonstrated proactive and positive approach to tasks and projects overall as well as to the types of scheduling & process changes that are inherent in a fast-paced businesswe offer a fully comprehensive benefits program with medical dental vision company paid life insurance short and long term disability great paid time off program that starts with 20 days of accrual per calendar year; great 401k plan with company match that is 100% vested immediately paid parental leave and other competitive benefit programs great salary and reward and recognition programs eeo minorities women veterans disabled bracketscience technology service bracket innovates at the leading-edge of clinical research data one exceptional service at a time from the advanced technology of our ecoa electronic clinical outcomes assessments flexible platform to the efficiency of our scalable and configurable randomization and trial supply management (rtsm) clinical irt solution to our science-focused rater training and quality assurance programs bracket does it faster better and with an eye on the future of our industry to achieve this we maintain an unwavering commitment to employing only the brightest most talented colleagues gleaned from a wide variety of professional fields if you are a creative problem-solver and have a hard-wired instinctual commitment to exceptional customer service we'd love to talk to you!position overview:the clinical data management sql programmer supports the implementation of bracket’s products and services to meet customer’s needs this position is responsible for working with other data management staff and internal project team members to ensure on time and accurate deliverables; implementing assigned programming analytical tasks; performing unit testing; documenting his or her work according to accepted quality principles; and supporting testing essential duties and responsibilities:note: other duties may be assigned communicates effectively within a multi-disciplinary project team to complete assigned tasks on time and within budgetperforms all work in accordance with documented standard operating procedures (sops) working instructions and best practicesassists in developing and improving standard operating procedures (sops) working instructions and best practicesassist with other departmental initiatives and projects as requiredenhance the bracket business model by institutionalizing business processes implementing best practices and templates and seeking ways to work more efficientlyconfiguration and developmentassume responsibility for the design development and on-time 
delivery of project specific configurations and customizationswork directly with the external and internal clients to prepare adapt or agree on all specificationscreate and document system design specifications and other technical documents as requiredunderstand and follow all coding standardscreate robust well documented codecreate database objects as requiredcomplete unit testing and peer review documentation as requiredsupport test script development and performance of user acceptance testing support all phases of testing by efficiently diagnosing and resolving defectsintegrate implemented code and database objects into release applicationparticipate in all post live study changes including risk assessment specifications testing and interactions with study team for shared activitiesrecommend design and implement on-going application and architectural improvementsapplication supportdiagnose and resolve defectsidentify areas where applications are impacting the underlying data work internal team to define test cases required to determine impact to datacomplete all required change control documentation including updates to requirements design and other technical documents as requireddeploy resolution to testing and production environment as requiredconsult with internal team on defects that impact core productskills & competencieseducationba bs degree or equivalent work experienceexperience2+ years of professional experience programmingexperience with some or all of the following technologies:microsoft sql server2008 2012 mysql or other relational databases including stored procedures views and triggersmicrosoft sql server reporting services or other reporting toolteam foundation server visual source safe subversion or other source control productexperience developing enhancing and customizing configurable applications is desirabledomain experience in any of the following is desirable:mobile device applications include smartphone and tabletelectronic data capture applicationsclinical trial management systemsexperience estimating development and support tasksfamiliarity with 21 cfr part 11 or experience in a regulated environment desirablecompetencies & personal attributesstrong verbal and written communication skillsability to complete high quality technical documentationdemonstrate extreme attention to detail and organization in all aspects of workability to quickly learn and apply new skills procedures and approaches demonstrated ability to meet very short deadlines & multi-task in an extremely fast-paced work environment with little direct supervisionproven ability to work both independently and in a team-oriented environment providing back-up support to team members & establishing maintaining effective work relationships with co-workers within and across functional areaspreviously demonstrated proactive and positive approach to tasks and projects overall as well as to the types of scheduling & process changes that are inherent in a fast-paced businesswe offer a fully comprehensive benefits program with medical dental vision company paid life insurance short and long term disability great paid time off program that starts with 20 days of accrual per calendar year; great 401k plan with company match that is 100% vested immediately paid parental leave and other competitive benefit programs great salary and reward and recognition programs eeo minorities women veterans disabled department: investments position summary: the developer will be responsible for assisting implementation of a data 
management solution this includes the analysis design development testing installation and initial maintenance of the solution and acting as interface with customers and other developers to determine the most efficient and cost-effective approach to meeting business requirements the project will utilize a variety of hardware and software technologies and may include new code construction modifications to existing modules and package implementations this position will apply disciplined software development processes and utilize leading edge technologies to engineer and implement solutions to business problems he she will display technical and functional competence in data management standards and uphold client's commitment to ethical business practices candidate responsibilities: collaborate with team members it and software consultants to ensure functional and business specifications are converted into flexible scalable and maintainable solutions designs develop and implement data warehouse strategies that utilize data marts and or data warehouse systems to enhance business processes and manage business intelligence design and manage data models for applications metadata tables and views ensuring system data and reporting stability security performance and reliability establish analytic environments required for structured semi-structured and unstructured data develop data quality metrics that identify gaps and ensures compliance with investment and enterprise wide standards active in role in quality assurance (qa) and user acceptance testing (uat) for data solution initiatives by providing testing support attending team meetings and documentation functional skills: 5+ years of overall experience in investments and insurance data and systems management data science programming and information systems experience with and implementation of any enterprise data management tools for investments (markit edm goldensource edm etc) including builds of various in-bound and out-bound feeds (intex trepp markit bloomberg epam etc ) understand and support business data needs; understand data layout and resulting business impact track record of collaborating with it and vendor teams to provide technical solutions and process improvements delivering end-to-end products processes on schedule and budget as per business requirements and sdlc standards including addressing performance and ad-hoc requests experience with data warehouse and bi systems working knowledge of financial investments works effectively with associates across business and management teams and with corporate adapt to changing business priorities and environments strong agile and waterfall experience; ability to break down complex business requirements into small user stories position qualifications: bachelor degree in management information systems computer science or a related field unix scripting experience strong development experience: java sql vba r python or similar programming languages amazon web services experience tableau experience (or any other visualization tool) strong knowledge of excel strong mathematical analytical and communication skills this position will be primarily located in our new york city ny (headquarters) office princeton information is one of the nation's top five privately-held it consulting firms in business since 1985 princeton information services a clientele of primarily fortune 500 companies nationwide with annual revenues over $120 million princeton information operates across the us from multiple 
regional offices our commitment to our consultants as a privately held company princeton information is solely committed to the success of clients and consultants - not to any shareholders pi's success is grounded in the relationships we build with our consultants we seek the best people; provide career path counseling; as well as the most challenging opportunities in business and in it as part of its culture of loyalty and commitment to its consultants princeton information is committed to doing all we can to ensure our consultants have the best possible search placement and work experience possible our services working with one princeton recruiter will gain you access to over 500 open requirements with the top clients in the us across all industries (finance insurance pharmaceutical commercial telecom media manufacturing) nationwide our local recruiters have in-depth knowledge of our clients and opportunities they will work with you to find you the best possible opportunities for you and your career our relationships our relationships with our clients as well as our consultants are critical to our success! we have a robust sales organization that ensures that princeton has the inside track on what attributes a person needs in order to be successfully placed and engaged at our clients we know the technical and non-technical skills that our clients are looking for and we ensure that you are educated about the client prior to your interview with them princeton is committed to going above and beyond to ensure that each meeting you have with a client is a successful one! - provided by dice access analysis analytical applications business intelligence business requirements computer consulting data warehouse developer development edm engineer excel hardware it java management mathematical metrics performance programming project python qa quality quality assurance sdlc security software sql system systems telecom testing unix vba web in order to apply for a position at lumeris you must create an account using your email address and a password of your choosing this account will allow you to receive notifications each step of the way through the job application process with these updates you’ll never have to wonder where you are in the process additionally we can easily send pertinent documents to you for your review once you create the account you may apply to any position you feel is a good fit without having to re-enter information thank you for your interest in lumeris **position:**manager technology - data operations**position summary:**the technology manager will be a member of the lumeris data operations team the primary responsibilities of the role will be to lead and mentor a team of analysts in implementation of company and internal projects and analysis and resolution of data issues this position is one which requires the ability to develop and mentor others to achieve more as team tasks include but are not limited to mapping extraction transformation and validation of data from the various healthcare and financial data sources **job description:****role and responsibility**+ manage the work of a team as it interacts with clients on implementation activities such as requirements gathering data transformation quality improvement and change management+ co-ordinate with other managers and teams on client implementations and support efforts to ensure clear communication and cross functional participation+ manage the development and implementation of data quality standards policies 
procedures and data metrics+ act in the role of subject matter expert within the areas of technology used by the team and patiently pass this expertise on to others+ recommend changes to policies and establish procedures that affect immediate organization(s)+ responsible for coaching team members to improve their skillset and grow their careers+ develop and foster a culture that is focused on providing value to customers and a mindset of continual improvement+ some travel will be required**experience qualifications and education**+ bachelor’s degree in healthcare management nursing computer science data science mis or equivalent job related experience+ requires at least 10 years of experience in it and at least 5 years of management lead experience over teams+ proven experience managing a consulting or operations team including individuals in leadership roles in a high paced enterprise environment+ experience with leading teams in client facing businesses analysis and or consulting engagements+ excellent problem-solving communication and time management skills+ detail oriented; able to work independently and set priorities+ experience developing client facing presentations representing the results of the analysis in a clear and concise manner+ excellent communication skills and the ability to work closely with customers and third parties to complete large system integration projects**benefits:**in addition to competitive salaries challenging work assignments and developmental opportunities lumeris offers employees a comprehensive benefits package to include medical dental vision life insurance short-term and long-term disability paid time off (pto) matching 401k and tuition assistance for more information on lumeris careers and to apply for positions please check out our web site: www lumeris com careerslumeris is an eeo aa employer m f v d to stay connected with exciting news and the latest job opportunities from lumeris follow us on twitter: @lumerisjobs **location:**st louis mo**time type:**full time**status:**xl - ft**join our growing team!**lumeris serves as a long-term operating partner for organizations that are committed to the transition from volume- to value-based care and delivering extraordinary clinical and financial outcomes lumeris enables clients to profitably achieve greater results through proven playbooks based on collaboration transparent data and innovative engagement methodologies lumeris offers comprehensive services for managing all types of populations including launching new medicare advantage health plans commercial and government health plan optimization and multi-payer multi-population health services organizations (phsos) currently lumeris is engaged with health systems provider alliances and payers representing tens of millions of lives moving to value-based care over the past seven years we have tripled in size to more than 800 employees and built the only solution on the market with our proven outcomes for the past six years essence healthcare a long-standing lumeris client has received 4 5 stars or higher from the centers for medicare and medicaid services (cms) essence healthcare was lumeris’ pioneer client and has been leveraging lumeris for more than a decade to operate its medicare advantage plans which serve more than 60 000 medicare beneficiaries in various counties throughout missouri and southern illinois in 2018 lumeris was named best in klas for value-based care managed services in the area of client-reported impact on the triple aim by klas 
research this was the third year in a row lumeris received the award and it has only been given for three years as the industry’s most reliable and effective partner for developing population health management solutions our success is driving tremendous growth in our company join us today in making a real difference in how healthcare is delivered!**why join lumeris?**at lumeris you will be part of team that is focused on solving the nation’s healthcare problem and you will be able to contribute to our purpose our environment is fast-paced change-oriented and focused on growth and employee engagement at lumeris we know that talent is best utilized when given the opportunity to succeed that is why we have removed the boundaries that inhibit success and focus on fostering an environment that allows employees to utilize their talents employee perks why you will love being part of the navy federal team:*competitive compensation with opportunities for annual raises promotions and bonus potential*best-in-class benefits! (7% 401k match pension plan tuition reimbursement great insurance options)*on-site amenities include fitness center wellness center cafeteria etc at pensacola fl; vienna va and winchester vacampuses*consistently awarded top workplace*nationally recognized training department by training magazine ind123*an employee-focused diverse and service-oriented workplace environmentbasic purpose to support information system division and the enterprise by providing comprehensive architectural leadership in translating navy federal's business vision and strategies into effective it and business capabilities through the design implementation and integration of it systems in the big data domain the big data architect will be responsible for guiding the evolution development and governance of the navy federal data architecture with a specific focus on the big data architecture required (individual role):• experience in data architecture and bi disciplines and deeper understanding of data warehousing bi advanced analytics concepts in large organizations• experience in developing enterprise data strategy and roadmaps to meet business goals and strategies• experience in using various data modeling techniques and their appropriate usage in data integration• extensive experience in architecting and designing data architecture solutions using hadoop ecosystem tools and technologies like hive impala hbase spark mapreduce scala and solr• proficient in data ingestion pipeline process data cleansing and metadata management on big data platforms• experience in architecting shared data assets like big data data discovery platforms master data metadata ods dw and data marts and defining data strategy• create artifacts to support architecture governance including standards advisories research papers reference models and reference architectures as needed or assigned• proficient in service oriented architecture and related methodologies• experience in providing technical and data leadership to the application development group it and the enterprise • experience in understanding enterprise-wide view of the business and its relationships to information technology• broad knowledge of zachman or togaf architecture frameworksdesired (individual role):• togaf certified architect or other equivalent certification• professional affiliation with dama or tdwi• experience with machine learning algorithms ai and predictive modeling techniques• experience with cloudera enterprise data hub (edh) governance and data pipeline 
management• ability to perform multi-tasking with minimal supervision • advanced degree in mis computer science statistics marketing management finance or related field• prior knowledge of financial industries and large banks• experience in enterprise java (j2ee) technologies web services soap xml uml information models data flow diagrams multi-threaded software applications object-oriented design and analysis methodologieshours:monday-friday 8:00am-4:30pmequal employment opportunity navy federal values celebrates and enacts diversity in the workplace navy federal takes affirmative action to employ and advance in employment qualified individuals with disabilities disabled veterans armed forces service medal veterans recently separated veterans and other protected veterans eoe aa m f veteran disabilityreqnumber: 35133-1a big data etl developer**chevy chase maryland united states** at https: geico referrals selectminds com jobs 7060 other-jobs-matching location-onlyinformation technology at http: geico referrals selectminds com landingpages information-technology-opportunities-at-geico-19r0003676requisition #apply for jobshare this jobsign up for job alerts**_job description_**the ds&t (data science and transformation) team is seeking a highly motivated team-oriented and process-driven big data etl developer who wants a challenging position to showcase their skills qualified candidates should possess 3+ years of professional experience on a hadoop platform the selected candidate will be part of a group of etl and report developers responsible for creating data marts used for enterprise reporting and data visualizations **_interested in joining our innovative team? if so read on!_****required qualifications:**+ bachelor's degree in a computer related field + 3+ years of hands-on experience in hadoop eco system (hdfs yarn mapreduce oozie and hive)+ 1 year of hands-on experience in spark core and spark sql+ 5+ years of hands-on programming experience in either core java or spark+ 1 year of hands-on experience in hbase cassandra any other nosql db+ understanding of distributed computing design patterns algorithms data structures and security protocols**\#li-cg1**\#li-cg1**_about geico_**for more than 75 years geico has stood out from the rest of the insurance industry! 
we are one of the nation's largest and fastest-growing auto insurers thanks to our low rates outstanding service and clever marketing we're an industry leader employing thousands of dedicated and hard-working associates as a wholly owned subsidiary of berkshire hathaway we offer associates training and career advancement in a financially stable and rewarding workplace our associates' quality of life is important to us full-time geico associates are offered a comprehensive total rewards program* including:+ 401(k) and profit-sharing plans+ medical dental vision and life insurance+ paid vacation holidays and leave programs+ tuition reimbursement+ associate assistance program+ flexible spending accounts+ business casual dress+ fitness and dining facilities (at most locations)+ associate clubs and sports teams+ volunteer opportunities+ geico federal credit union* benefit offerings for positions other than full-time may vary geico is an equal opportunity employer geico conducts drug screens and background checks on applicants who accept employment offers **_how to apply_**click "apply for job" to complete your application you will need an active email address and phone number please upload your resume preferably as word doc files or pdf once you begin your application you can save it and access it later your application should include any work and or internship experience from at least the past five years our client is looking for a senior data architect to join their team this individual will lead the data architecture for large initiatives with both physical and logical data models to support use cases required skills qualifications:expert skills in data architecture data models and data systemsability to define data layer systems and solutionsbig data technology experiencemdm expert knowledgesoa expert experienceexperience with aws is a plusbachelor degree in information systems computer science or other technical degree (math physics engineering) master' s degree preferred10+ years experience architecting and managing data systems big data engineer architectlocation: boston maduration: 6 month contract to full time permdescription:define and express the big data processing architecture and vision for business and technical strategies design and develop big data systems supporting internal users and analytics requirements train and mentor other technical staff and business as appropriate provide architectural leadership to development and delivery teams work with cross-functional it groups to promote open communication and teamwork define and develop strategy for oracle to hadoop migration must haves:10+ years* of professional experience with core java3+ years* experience with big data technologies such a kafka spark and or hadoopadvanced experience with sql programming for oracle databasesbachelor*s degree in computer science or a related fieldnice to haves:5+ years* experience with shell scripting2+ years* experience with scalaexperience in the healthcare industryadditional requirements:must be able to pass a background check and drug test - provided by dicecore java kafka spark and or hadoop sql programming our client is seeking an individual with advanced experience re-implementing large scale enterprise asset management solutions including complex integrations to erp and ancillary systems the incumbent must have a successful history of leading data conversion planning development testing and execution that includes proven experience in maximo version 7 required qualifications:· ten (10) 
or more years of experience in data analysis and conversion with greenfield enterprise system data migration testing and validation· a bachelor’s degree in computer science information technology or a closely related field; significant experience may be substituted for the educational requirement· must have experience with generating precise technical design and field level mapping documents supporting inventory asset and work management records· strong relational database skills migration and transformation scripting expertise and a solid etl skill set· proven experience in a minimum of three end-to-end enterprise level data migration projects· able to resolve highly complex issues where analysis of situations or data requires an in-depth evaluation of variable factors including the ability to select methods techniques and tools to ensure best results· ability to work effectively on a complex development and deployment project with tasks timelines and workflows· ability to work with cross-functional teams to address complex business or systems issues· flexibility to work non-traditional business hours as needed to achieve deployment on time and on schedule· capability to work with others from diverse backgrounds· self-motivated and works both independently and as part of a team· proficiency in microsoft suite (outlook visio word excel and access)· detail-oriented and excellent follow-up skills· strong analytical skills· demonstrated effective communication and interpersonal skills · the flexibility to orient and work at multiple na business locations as needed· expertise and development experience with all related technologies including xml xslt pl sql jms oracle maximo mif and rest api’s· capable of defining and testing core maximo and maximo custom solutions aligned to ricefw in the form of migration strategy planning testing and recoverypreferred qualifications:· five (5) or more years of experience working with ibm maximo solutions· capable of maximo front-end data validation across enterprise applications· experience in maximo integration with sap ariba· work history that includes utilizing informatica· professional experience following sdlc and agile methodologies etl hadoop data and integration engineer (contract to hire) job location: glen allen vajob description:amitech is seeking a data and integration engineer this position will lead agile software development efforts as a technical leader will be a hands-on data and integration engineer who can write quality code assist with problem solving root cause analysis and trouble shooting the candidate will also be responsible for ensuring data governance and best practices are embraced the ideal candidate will have the following qualifications:preferred skills:5-7 years of experience building and deploying software using hadoop ecosystemhands on aws experience building and managing etl jobs in cloudprogramming language preferences: python java sparkexpert in data integration with rdbms big data hadoop data lake concepts and has relevant experience with various os network and storage conceptsfamiliarity with middleware etl products and willingness to learn new tools at a rapid pacestrong data architect that can work with it architect and it analysts to break down a complex system into smaller componentsexperience performing data modelsstrong understanding of emr conceptseducation:bachelor’s degree in computer science or related fieldabout usamitech a leading healthcare analytics and strategy consulting firm leverages the true value in 
data to help healthcare systems and insurers lower costs improve quality of care and achieve better business outcomes reasons to partner with usamitech is a rapidly growing organization focused on our employees and we’re committed to offering opportunities to the best in the industry our diverse and innovative approach to everything we do means we’re looking for the groundbreakers and the pioneers—people who think differently and create the future the manager of master data management (mdm) is responsible for establishing an mdm vision and associated set of processes in support of its erp system this role is key in developing governance and gaining buy-in from various stakeholders across the organization this is a high visibility role which is crucial for the growth and advancement of our business the manager of mdm will work closely with business and technical experts to establish the corporate practices for mdm and master data governance responsibilities span from the creation of policy and system recommendations through implementation of recommendations to the monitoring analyzing and remediating of master data quality concerns will work in close partnership with cross-functional teams across the organization to develop launch refine and scale master data programs and activities this position implements processes and governance to ensure data integrity data standards standardization and change management the individual will be responsible for the integrity of master data this role involves data creation governance project management process development as well as regular audits this role is a highly collaborative position with many key stakeholders of master data including but not limited to: it r&d quality finance sales marketing and supply chain understanding processes dependencies managing deadlines problem solving and managing cross functional relationships are crucial to this position conceptualize and document comprehensive master data management and governance strategies and processes which are suited to isagenix systems capabilities and business processes as well as the roadmap which will meet our future needs serves as a mdm thought leader deeply understand the corporate system landscape which includes an erp system third-party cloud systems an e-commerce engine data warehouses quality management systems and external data sources these systems span a wide-range of technologies and have varying degrees of sophistication work closely with all levels of the organization including: presenting to and educating the business on master data issues and opportunities communicating the impact on complex integrated business processes and establishing the master data quality metrics and targets work with the business unit representatives (finance supply chain marketing etc ) and it to set strategic master data direction that will enable strategy and to resolve issues with respect to master data and ensure master data aligns with overall business objectives implement and improve master data policies and procedures in alignment with enterprise information management policies to focus on operational efficiencies through data quality and standardization partner with erp support teams and data owners to design and develop workflows and processes procedures to enable efficient data setup provide recommendations on industry leading and best practices services and systems used for the improvement of data quality troubleshoot and resolve master data issues that relate to transactions affected by 
master data or other operational issues serve as the strategic manager for data within the business and in that capacity serve as the principal voice for data quality integration and governance define develop and measure data quality analytics and key performance indicators (kpis) drive continuous productivity by ongoing challenge to status quo to discover and deliver improvement navigate comfortably through ambiguity to manage and balance uncertainty and risk to make effective decisions bachelor’s degree in a quantitative major such as supply chain business analytics computer science information systems industrial engineering or similar business area 8 years of experience in gathering business requirements and or deployment of erp systems (such as ms dynamics ax sap oracle etc ) focusing on master data management in a large enterprise advanced skills with microsoft office software including word powerpoint excel and outlook deep understanding of how master data is used throughout a supply chain – from planning to ordering materials finished goods production transportation storage at dcs all the way through shipment to customer and subsequent reporting proven attention to detail required must be analytically minded and a methodical problem solver expert understanding of master data management strategies mdm services and capabilities metadata management data governance and data quality solutions experience in a lead role on mdm projects ability to develop and implement data governance and standards strong written and verbal communication skills including the ability to quickly synthesize data and develop recommendations strong analytical prioritizing interpersonal problem solving presentation and planning skills proven ability to complete projects and achieve results in an ambiguous work environment ability to establish and articulate a vision set goals develop and execute strategies and track and measure results strong collaborator - ability to operate with cross-functional teams (e g finance sales marketing r&d quality supply chain etc ) mba ms in business administration supply chain business analytics computer science information systems experience working with complex global inventory networks 10+ years of experience gathering business requirements and or deployment of erp systems (such as ms dynamics ax sap oracle etc ) focusing on master dara management in a large enterprise experience working with primarily third-party manufacturing and 3pl operations experience in the food or pharmaceutical industry technical knowledge and understanding of master and reference data elements processes and organizational support models **job description**ibm global business services (gbs) is a team of business strategy and technology consultants enabling enterprises to make smarter decisions and providing unparalleled client and consumer experiences in cognitive data analytics cloud technology and mobile app development with global reach outcome-focused methodologies and deep industry expertise ibm gbs empowers clients to digitally reinvent their business and get the competitive edge in the cognitive era in over 170 countries we live in a moment of remarkable change and opportunity data and technology are transforming industries society and even the workplace—by creating professions that didn’t exist before the emergence of data cloud social and mobile ibm global business services is a leader in this transformation where you can make a difference at ibm you can be part of a team that strives to make 
the world work better and to help our clients succeed through research analytics and technology join us and discover what you can make of this moment what will you make with ibm? ibm com jobs**this is an opportunity within our healthcare and life sciences sector for a senior solution architect** to be part of building healthcare analytics solutions for our customers we are seeking an individual with experience in solution concept definition and product requirements definition experience in the healthcare industry including deep understanding of clinical data workflows application design and delivery data warehousing data analytics benchmarking and reporting will be a huge plus **location:** anywhere in the us; will require travelthis individual will:+ facilitate strategy sessions and contribute to competitive information + work closely with product management and product development to interview research analyze and derive business functional and technical requirements + develop an in-depth understanding of the product and underlying data assets and models methods + define the solution concept design for custom solutions leveraging platform components reusable modules and existing product capabilities create logical data models for the solutions + clarify and translate market requirements into functional and non-functional specifications and collaborate in the product development process assist in resource planning process make important tradeoff recommendations between functionality resources and timing drive the user experience needs of the solution consistent with company standards + lead the design and testing phases including design and testing of prototypes and orchestration and sign-off on acceptance test results to ensure design integrity + work with cross-functional teams -including sales take a lead role in creating custom solutions proposals collaborate with operations and account teams to understand documentation and training needs + work with the appropriate teams to deliver the needed materials + assist with implementations when needed + gain an understanding of implementation issues and product improvement opportunities + provides leadership to business analysis discipline and best practices + this person will need to know ibm's unified data model for healthcare solution (udmh) **benefits**health insurance paid time off corporate holidays sick leave family planning financial guidance competitive 401k training and learning we continue to expand our benefits and programs offering some of the best support guidance and coverage for a diverse employee population + http: www-01 ibm com employment us benefits + _https: www-03 ibm com press us en pressrelease 50744 wss_**corporate citizenship**with an employee population of 375 000 in over 170 countries amazingly we connect collaborate and care ibmers drive a corporate culture of shared responsibility we love grand challenges and everyday improvements for our company and for the world we care about each other our clients and the communities we live work and play in!+ http: www ibm com ibm responsibility initiatives html+ http: www ibm com ibm responsibility corporateservicecorpsan**required technical and professional expertise**+ at least 4 years hands on experience with ibm’s udmh solution or ibm provider data model+ at least 7 years healthcare payer or provider market experience+ at least 5 years experience development experience with major etl tool (data stage informatica ab initio)**preferred tech and prof experience**+ at least 
10 years datastage implementation experience
+ at least 10 years data modeling/mapping experience
+ a 4-year degree in healthcare or an it-related field
+ at least 5 years sql development experience
+ at least 5 years data modeling/mapping experience

**eo statement** ibm is committed to creating a diverse environment and is proud to be an equal opportunity employer. all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age or veteran status. ibm is also committed to compliance with all fair employment practices regarding citizenship and immigration status.

as a management services organization (mso), continuum health delivers proven solutions to provider groups and aggregators, helping foster self-sufficiency by maximizing fee-for-service payments, transitioning them to value-based programs and preparing them for risk. continuum also collaborates with payers to help drive value-based adoption among providers and improve the health outcomes of patients. the company optimizes performance through revenue cycle management, value-based care, practice management services and specialty care solutions. more than 1,500 primary care physicians, specialists and nurse practitioners, caring for more than 1 million patients, depend on continuum's business and clinical experts to help achieve their goals. learn more at www.continuumhealth.net

assist in maintaining the database collection system, data informatics and other tasks that optimize efficiency and quality through the enterprise data warehouse. participate in the data ingestion processes to assure data integrity, quality and data validation (a small validation sketch follows this posting). collaborate with the analytics team members, participating in day-to-day operational demands. mentor and oversee the activities of the informatics data analysts. collaborate with management to prioritize client business and information needs. identify errors in data and take measures to resolve them. participate in building and operationalizing processes that ensure timely data-loading and maintain the accuracy and relevance of the data used. promote continuum's interest in maximizing consistent, quality data from payers and other source systems. maintain a positive working relationship with all data-supplying organizations. incorporate client (internal/external) feedback and experience into solutions to foster continual reporting improvements. responsible for maintaining a high level of customer satisfaction with clients, including consistent deliverables & reliability. support and execute change management activities regarding data input, ensuring constant compliance with standards, and effect change where merited. escalate & prioritize issues as needed to informatics management. participate in the professional performance planning process with management, including metrics, education and professional growth. responsible for timely and appropriate communications through e-mail, meeting documentation and verbal/written correspondence. perform other related duties as required by leadership and management.

minimum 5+ years' experience with management of informatics tools or other equivalent analytics tools required for the role. fluency in informatics tools: sql, tableau, ssas, ssrs, hadoop; etl. advanced microsoft office skills, primarily in ms excel. proven working experience as a data coordinator. thorough understanding of the integration between applications. track record of learning new skills and putting them to use immediately. candidate must have a bachelor's degree in computer science, business administration, a related field, or an equivalent combination of education and experience. knowledge & skills: working knowledge of informatics and analytics principles. technical expertise regarding data models, database design and development, data mining and segmentation techniques. ability to understand and interpret technical concepts. ability to prioritize work and meet project deadlines for the informatics team. must possess strong interpersonal and communication skills, with the ability to work effectively with a wide range of groups. must have the ability to communicate effectively, both orally and in writing. must have excellent follow-through and attention to detail. ability to work collaboratively with technical resources (sys admins, dba's, da's, etc.). experience working under pressure and in a fast-paced environment. exceptional attention to detail, ability to multi-task and meet tight deadlines.
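the ingestion and validation duties above lend themselves to a short illustration. the following is a minimal sketch only, assuming pandas and invented column names (patient_id, claim_id, service_date, amount) rather than anything from the posting:

```python
# Minimal data-quality report computed before loading a batch to the warehouse.
# Column names and rules are invented for illustration.
import pandas as pd

def validate(df: pd.DataFrame) -> dict:
    return {
        "null_counts": df.isna().sum().to_dict(),
        "duplicate_claim_ids": int(df.duplicated(subset=["claim_id"]).sum()),
        "negative_amounts": int((df["amount"] < 0).sum()),
        # Dates that fail to parse (or are missing) count as bad.
        "bad_dates": int(pd.to_datetime(df["service_date"], errors="coerce").isna().sum()),
    }

df = pd.DataFrame({
    "patient_id": [1, 2, 2],
    "claim_id": ["a", "b", "b"],
    "service_date": ["2018-01-02", None, "2018-01-03"],
    "amount": [120.0, -5.0, 80.0],
})
print(validate(df))
```

a real pipeline would log this report and quarantine failing rows rather than print it.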
the role of the data architect is to expand the company's use of data as a strategic enabler of corporate goals and objectives. the enterprise data architect will achieve this by strategically designing, developing and implementing data models for enterprise-level applications and systems. these models shall be architected at the following layers: conceptual, logical, business area and application (a minimal modeling sketch follows this posting). this individual will act as the primary advocate of data modeling methodologies and data processing best practices.

essential functions:
- in conjunction with data users, department managers, clients and other key stakeholders, develop and deliver long-term strategic goals for data architecture vision, standards and the data management roadmap
- create short-term tactical solutions to manage performance data for all departments; work with stakeholders to design and generate performance reports and kpi dashboards, and facilitate reporting to executive management
- establish processes for governing the identification, collection and use of corporate metadata; take steps to assure metadata accuracy and validity
- enforce the organization's data management policy
- establish methods and procedures for tracking data quality, completeness, redundancy and improvement
- conduct data capacity planning, life cycle duration, usage requirements, feasibility studies and other tasks
- create strategies and plans for data security, backup, disaster recovery, business continuity and archiving
- ensure that data strategies and architectures are in regulatory compliance
- ensure the success of enterprise-level application rollouts (e.g. erp, scm, crm, sap, peoplesoft, etc.)
- coordinate with vendors and service providers to select the products or services that best meet company goals
- assess and determine governance, stewardship and frameworks for managing data across the organization
- develop and promote data management methodologies and standards
- select and implement the appropriate tools, software, applications and systems to support data technology goals
- oversee the mapping of data sources, data movement, interfaces and analytics, with the goal of ensuring data quality
- collaborate with project managers and business unit leaders on all projects involving enterprise data
- address data-related problems with regard to systems integration, compatibility and multiple-platform integration
- act as a leader and advocate of data management, including coaching, training and career development for staff
- develop and implement key components as needed to create testing criteria in order to guarantee the fidelity and performance of the data
architecture
- document the data architecture and environment in order to maintain a current and accurate view of the larger data picture
- identify and develop opportunities for data reuse, migration or retirement
- manage systems, applications or appliances and their associated operating systems, software and reporting tools
- design, implement and administer equipment, hardware and software upgrades specific to enterprise database systems
- interact and negotiate with vendors and contractors to secure products and services
- collaborate on the development, implementation and maintenance of policies, procedures and associated training plans for the enterprise
- administer and maintain end-user accounts, permissions and access rights
- perform server and security audits under direction of senior security staff or executive it management
- perform backups and recovery
- monitor and test performance, and provide performance statistics and reports
- collaborate on, recommend, schedule and perform improvements, upgrades and repairs (other duties and assignments as may be assigned at the sole discretion of the employer)

qualifications:
- college diploma or university degree in computer science, information systems or computer engineering
- certification in database administration (e.g. cdp-dba) is required
- at least two (2) years' work experience as a data professional or information architect
- in-depth technical knowledge of network, pc and platform operating systems, including windows desktop and server operating systems
- working technical knowledge of current systems software, protocols and standards
- strong understanding of database structures, theories, principles and practices
- working technical experience with designing, building, installing, configuring and supporting database platforms, including but not limited to: ms sql, mysql, mongo
- hands-on experience with data architecting, data mining, large-scale data modeling and business requirements gathering/analysis
- direct experience in implementing enterprise data management processes, procedures and decision support
- hands-on database tuning and troubleshooting experience
- experience with data processing flowcharting techniques
- experience working in a fast-paced, matrix, project-management environment
- knowledge of applicable data privacy practices and laws

skills required:
- must be able to conduct business analysis and generate technical requirements
- adept at requirements analysis, entity relationship planning and database design
- excellent client/user interaction skills
- strong understanding of relational data structures, theories, principles and practices
- strong familiarity with metadata management and associated processes
- hands-on knowledge of enterprise repository tools, data modeling tools, data mapping tools and data profiling tools
- demonstrated expertise with repository creation and data and information system life cycle methodologies; ability to manage data and metadata migration
- understanding of web services (soap, xml, uddi, wsdl)
- object-oriented programming experience (e.g. using java, jsp, j2ee, ejb, .net, websphere, etc.)
- excellent communication and written skills
- must be able to prioritize and multi-task
- navigate and negotiate through difficult situations
- ability to communicate with all levels of employees

scope of responsibility & positions supervised:
- this position will require frequent interaction with senior executives
- requires independent decision making
- no direct reports, and must be able to operate effectively without close supervision

special working conditions:
- on-call availability for all weekends and select days per month
- sitting for extended periods of time
- dexterity of hands and fingers to operate a computer keyboard, mouse, power tools, and to handle other computer components
- occasional inspection of cables in floors and ceilings
- lifting and transporting of moderately heavy objects, such as switches, routers, computers and peripherals
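to make the conceptual/logical/physical layering above concrete, here is a minimal sketch using sqlalchemy (assuming sqlalchemy 1.4+); the customer/order entities are invented for illustration, not taken from the posting:

```python
# Logical model expressed as declarative classes; physical DDL is derived from it.
from sqlalchemy import create_engine, Column, Integer, String, ForeignKey
from sqlalchemy.orm import declarative_base, relationship

Base = declarative_base()

class Customer(Base):            # logical entity
    __tablename__ = "customer"   # physical table name
    id = Column(Integer, primary_key=True)
    name = Column(String(100), nullable=False)
    orders = relationship("Order", back_populates="customer")

class Order(Base):
    __tablename__ = "orders"
    id = Column(Integer, primary_key=True)
    customer_id = Column(Integer, ForeignKey("customer.id"), nullable=False)
    status = Column(String(20), default="open")
    customer = relationship("Customer", back_populates="orders")

# Emitting the physical layer: CREATE TABLE statements for the configured engine.
engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)
```

the point of the layering is that the classes capture the logical model once, while `create_all` derives the physical ddl for whatever engine is configured.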
data architect manager. full-time (40 hours/week) position under the general direction of the director of information technology. responsible for leading a team of data architects to ensure the effective installation, operation, maintenance, data integrity and performance of the agency's databases and supporting infrastructure, including: microsoft sql report server, back office and internal information servers, electronic health record (ehr) system, financial and fundraising systems, executive dashboards and help desk software.

position responsibilities include but are not limited to: supervise, guide and mentor the agency data architect team. lead the conceptualization and delegation of tasks for complex, high-level projects to ensure they are on track and in alignment with the agency's strategic goals. architect data infrastructure to scale with organizational growth and improve processes such as system-level monitoring, business continuity, backups, etc. improve and standardize the systems development life-cycle (sdlc) for faster delivery of robust systems to stakeholders. improve the testing environment to resemble the production environment throughout the sdlc, for better identification of defects before delivery. work with the director of information technology to ensure effective data security and hipaa compliance of agency data systems. ensure effective and accurate data integration with agency data systems, such as our electronic health record system, human resources management system, and financial and fundraising systems. respond to requests for data reports; track data, analyze and provide reports. identify, own, track and resolve database incidents and problems efficiently and quickly; proactively escalate any issues that can't be resolved within established timeframes. establish and maintain effective communications and working relationships with technology customers within the agency to keep them updated on their requests, resolve problems, etc.

position requirements include but are not limited to: a bachelor's degree, preferably in computer science, computer engineering, information systems or a related field. minimum five years' experience and demonstrable skills and knowledge in relational databases and database technologies, including microsoft sql, toad for oracle and php web development. minimum one to three years' experience supervising a team of technical staff. project management skills; certified project management professional (pmp) or credit toward a pmp is a plus. experience with data extraction, database reporting and analysis tools, upgrading and maintaining database systems, and programming custom web pages. location: pasadena, ca.

interested candidates may contact vibhor babbar at 609-371-5400 x 305 or vibhorb@vgroupinc.com for further information. direct end client: state of new york. job title: data modeler. duration: 24 months. start date: april 30, 2018. location: albany, ny 12229. position type: contract. interview type: phone/webex/in-person. requirement id: sny_datam326_vb.

required skills:
• 72 months of demonstrated hands-on experience with developing and maintaining logical and physical dimensional data models, and building data migration strategies utilizing sound concepts of data modeling, including star schema,
snowflake schema, etc., in a data warehouse / business intelligence environment
• 72 months of demonstrated hands-on proficiency in a data modeling tool (e.g. erwin)
• 72 months of demonstrated hands-on experience in relational data management disciplines, and design and development experience using oracle technologies
• 72 months of demonstrated hands-on experience utilizing sql programming for data analysis purposes in a data warehouse / business intelligence environment
• 72 months of demonstrated hands-on, in-depth experience with business intelligence tools (e.g. obiee 11g) involving data analytics skills
• 72 months of demonstrated hands-on, in-depth experience with designing, developing and creating applications and health care business intelligence reporting solutions
• 48 months of demonstrated analytics and communications experience (e.g. math, statistics, quantitative methods and writing)
• 48 months of demonstrated hands-on experience with reviewing the data modeling of others for compliance with accepted standards of development
• bachelor's degree in business intelligence, information science, computer science or mathematics

department: the new york state office of mental health (omh). project or program name: eim data warehouse, state information systems and various health cluster projects as necessary. description: this data modeling position is required to support the continued development and approval processes of business solutions in response to critical agency requirements for a portfolio of agency products. the data modeler will participate in a team approach to:
• work with smes, business analysts and technology teams to understand the data requirements and the full attribute set for entities and dimensions;
• convert business processes, domain-specific knowledge and information needs into a conceptual model;
• convert conceptual models into logical models with detailed descriptions of entities and dimensions;
• develop and maintain fully defined conceptual, logical and physical dimensional data models to ensure the information models are capable of meeting end user and developer needs;
• develop data models and data migration strategies utilizing sound concepts of data modeling, including star schema, snowflake schema, etc. (a minimal star-schema sketch appears after this list);
• model aggregation layers and specific star schemas as subject areas within a logical and physical model;
• understand and meet referential data integrity requirements;
• document decisions made in meetings and alternative solutions discussed.
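to ground the star-schema vocabulary used throughout this posting, here is a tiny, self-contained sketch: one fact table joined to two dimensions, plus the kind of rollup query a bi tool such as obiee would generate. the tables and figures are invented, not from the omh warehouse:

```python
# Tiny star schema in an in-memory SQLite database: a central fact table
# (admissions) keyed to two dimension tables (date, facility).
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE dim_facility (facility_key INTEGER PRIMARY KEY, name TEXT, region TEXT);
CREATE TABLE fact_admissions (
    date_key INTEGER REFERENCES dim_date(date_key),
    facility_key INTEGER REFERENCES dim_facility(facility_key),
    admissions INTEGER
);
INSERT INTO dim_date VALUES (20180401, 2018, 4), (20180402, 2018, 4);
INSERT INTO dim_facility VALUES (1, 'albany', 'capital'), (2, 'buffalo', 'west');
INSERT INTO fact_admissions VALUES (20180401, 1, 12), (20180401, 2, 7), (20180402, 1, 9);
""")

# Typical BI-style rollup: join the fact to its dimensions and aggregate.
for row in con.execute("""
    SELECT d.month, f.region, SUM(a.admissions)
    FROM fact_admissions a
    JOIN dim_date d ON d.date_key = a.date_key
    JOIN dim_facility f ON f.facility_key = a.facility_key
    GROUP BY d.month, f.region
"""):
    print(row)
```

a snowflake schema differs only in that the dimensions themselves are further normalized (e.g. region split into its own table).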
v group inc. is an it services company which supplies it staffing, project management and delivery services in software, network, help desk and all it areas. our primary focus is the public sector, including state and federal contracts. we have multiple awarded contracts with the following states: ar, ca, de, fl, ga, il, ky, md, me, mi, nc, nj, ny, oh, or, pa, sc, tx, va and wa. if you are considering applying for a position with v group, or partnering with us on a position, please feel free to contact me with any questions you may have regarding our services and the advantages we can offer you as a consultant. please share my contact information with others working in information technology. website: www.vgroupinc.com twitter: @vgroupitservices facebook: www.facebook.com/vgroupit

title: aws enterprise architect - data engineering. type: full-time perm role. location: stamford, ct. must have: strong aws cloud experience (great understanding of cloud architecture for data processing and building a new data management platform on aws). us citizens, green card holders and those authorized to work in the us are encouraged to apply; we are unable to sponsor h1b candidates at this time.

job description: the enterprise data architect is the leader and subject matter expert who will define, implement and govern the information and data architecture strategy, specifically on designing and implementing the flow, transformation, storage and visualization of consumer data. this role will be a key thought leader for evaluating new technologies and solutions, and will partner with data analytics and other technology teams. our environment is dynamic, fast-paced and lots of fun.

key responsibilities: in-depth knowledge of big data solutions and the hadoop ecosystem, to evaluate, recommend and implement technologies for data ingestion, etl, storage and reporting. able to design a data lake architecture as a centralized data hub to deliver data on demand, and to manage data for accuracy, currency and usage (a small ingestion sketch follows this posting). develop standards for naming, describing, governing, managing, modeling, storing, cleansing, transforming, searching and delivering all consumer data (which includes methodologies, tools, governance and conventions). provide architecture guidance, solution architecture, technology leadership, best practices and detailed design, and lead development efforts. work with the data analytics team (i.e. data scientists) to translate business requirements into functional requirements and implement realistic technology solutions for the same. understand how the data relates to current operations and the effects that any future process changes will have on the use of data in the organization. work with team members and other technology groups to ensure all systems are scalable, optimized for performance, have full dr redundancy and are secure. build a coe for data/information life cycle management, governance, lineage and quality. manage custom development of a data integration platform for the big data platform.

qualifications: 10+ years in enterprise data architecture. 8+ years of experience architecting and supporting high-performance, highly available and scalable information management solutions. 10+ years of experience with implementing data reporting, integration and visualization tools (i.e. cognos, informatica, python, spark, tableau, etc.). proven expertise in relational and dimensional data modeling. strong understanding of cloud architecture, specifically aws as it relates to data processing (i.e. ec2, s3, redshift, etc.). aws certification preferred. in-depth knowledge of big data solutions and the hadoop ecosystem. understanding of pii standards, processes and security protocols. familiarity with data anonymization concepts and technologies preferred. experience leading and architecting enterprise-wide initiatives, specifically system integration, data warehouse builds, data mart builds, data lakes, etc., for a large enterprise. able to confidently express the benefits and constraints of technology solutions to technology partners, stakeholders, team members and senior levels of management. experience implementing and supporting operational data stores, data warehouses, data marts and data integration platforms. experience leading information management related initiatives (system integration, data warehouse build, data mart build or similar). experience with physical, logical and conceptual data modelling. basic ability to manage, implement and write etl scripts (experience with python & spark preferred). willing and able to work flexible hours and be "on-call" as needed. relevant college degree in computer science (or related technical field) preferred.
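as a hedged sketch of the "data lake as a centralized data hub" idea above: landing raw json events in s3 under date-partitioned keys, which is commonly the first (raw) zone of a lake on aws. the bucket name, key layout and event shape are all assumptions for illustration:

```python
# Land a batch of raw JSON events in S3 under a date-partitioned prefix,
# the conventional raw zone of a data lake. Requires AWS credentials.
import json
import datetime
import boto3

s3 = boto3.client("s3")
BUCKET = "example-consumer-data-lake"  # hypothetical bucket name

def land_events(events, source="web"):
    today = datetime.date.today()
    key = (f"raw/{source}/year={today.year}/month={today.month:02d}/"
           f"day={today.day:02d}/events-{datetime.datetime.utcnow():%H%M%S}.json")
    body = "\n".join(json.dumps(e) for e in events)  # newline-delimited JSON
    s3.put_object(Bucket=BUCKET, Key=key, Body=body.encode("utf-8"))
    return key

# land_events([{"user": 1, "action": "view"}]) would write one NDJSON object.
```

downstream layers (redshift, spark, athena) can then read these partitions on demand without touching the source systems.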
the information & analytics initiative is helping boeing lead the aerospace industry by providing seamless, enterprise-wide access to valuable information and analytics for increased efficiency, business intelligence and profitable growth. the platform is the backbone of the boeing analytx capability; it aids in data discovery, analytics solution development and delivery. data scientists and application developers can rapidly utilize data assets from across the enterprise to deliver insights and value to the business and to our customers. you will be part of the analytx platform and data team and will partner closely with a team of data technologists, data scientists and business analysts leading boeing's big data strategy. you will architect and implement these road maps and bring to life revolutionary new analytics and insights. you will provide technical direction to the engineering and application team. you will collaborate with internal functions to utilize the new big data tools.

responsibilities:
* lead a team of technical resources performing solution development in the big data ecosystem
* provide architecture and technology leadership across batch and streaming data processing platforms
* focus in one or more core areas: hive, hbase, kafka, storm, spark, spark streaming, nifi and other tools within the big data ecosystem
* design and develop data pipelines (code, scripting, tooling) for both structured and unstructured data (a streaming sketch follows this posting)
* participate in requirements gathering and design technical workshops with platform users
* estimate new projects
* evaluate new big data technologies
* leverage the hadoop ecosystem to manage data at scale
* integrate with external data sources and apis
* design, build and deliver apps following industry best practices
* work with the developers, business analysts and subject matter experts to understand the complex technological system in order to produce integrated, end-to-end solution options

boeing is the world's largest aerospace company and leading manufacturer of commercial airplanes and defense, space and security systems. we are engineers and technicians, skilled scientists and thinkers, bold innovators and dreamers. join us, and you can build something better for yourself, for our customers and for the world.

*typical education/experience:* technical bachelor's degree and typically 9 or more years' related work experience, or a master's degree with typically 7 or more years', or a phd degree with typically 4 or more years' related work experience, or an equivalent combination of education and experience. a technical degree is defined as any four-year degree or greater in a mathematic, scientific or information technology field of study.

*work authorization:* this position must meet export control compliance requirements; therefore a "us person" as defined by 22 c.f.r. § 120.15 is required. "us person" includes us citizen, lawful permanent resident, refugee or asylee.

*job responsibilities:* • bachelor's degree • experience in software development • experience building data pipelines • experience with hadoop, hive, spark, kafka, hbase • experience on hortonworks hadoop distribution • application development in java • experience in linux • ability to "read the manual" and figure it out

basic qualifications (required): experience with hadoop in a production environment; experience working with linux; experience building and deploying java applications; experience leading a software development project. preferred qualifications: sql experience.

**job:** *adv information technologist* **organization:** *analytics & information mgmt svcs* **title:** *advanced technologist - hadoop* **location:** *washington-bellevue* **other locations:** *united states-missouri-saint louis, united states-south carolina-north charleston* **requisition id:** *1800011926*
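a minimal sketch of the kafka-to-spark-streaming pipelines named above, assuming pyspark with the spark-kafka connector available on the cluster; broker, topic, schema and paths are placeholders, not boeing systems:

```python
# Structured Streaming sketch: Kafka topic -> parsed JSON -> Parquet files.
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import StructType, StructField, StringType, LongType

spark = SparkSession.builder.appName("event-pipeline").getOrCreate()

# Schema for the incoming JSON payloads (invented fields).
schema = StructType([
    StructField("sensor_id", StringType()),
    StructField("ts", LongType()),
    StructField("reading", StringType()),
])

events = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")  # placeholder
          .option("subscribe", "sensor-events")              # placeholder topic
          .load()
          .select(from_json(col("value").cast("string"), schema).alias("e"))
          .select("e.*"))

query = (events.writeStream
         .format("parquet")
         .option("path", "/data/lake/sensor_events")          # placeholder path
         .option("checkpointLocation", "/data/checkpoints/sensor_events")
         .start())
# query.awaitTermination() would block until the stream is stopped.
```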
our mission as a leading investment management firm is to help our clients achieve their long-term financial goals. we believe our associates are the key to this mission, and we are always looking for talented individuals who share our commitment to our clients' success. if you're looking for challenging work experiences and the ability to learn in a collaborative culture, we invite you to explore the opportunities available at t. rowe price.

the senior cloud data architect acts in a consultative nature to both the business and technology in identifying, researching and implementing leading-edge technologies and practices. this architect has a depth and breadth of professional experience in data technologies, processes, practices and related areas. you will utilize and apply that knowledge and expertise across the organization to ideate, architect and create next-generation solutions. you will serve as a strategic advisor to the business and technology management on key technology solutions.

principal responsibilities: you possess a breadth and depth of experience in the information architecture domain and can lead our data solution development efforts in close partnership with our business stakeholders and technology delivery teams. your prior experience includes, but is not limited to, developing reference architectures for data acquisition, storage and distribution, data products, solution evaluations, business intelligence, data warehousing, metadata management, data quality management, data modeling and mdm. you will be part of the enterprise data architecture team and help steer strategic technology direction, define target state architecture and technology roadmaps, build reference implementations and mentor others, while championing and maturing the enterprise architecture practice. the person in the role is expected to execute on the following responsibilities:
+ build and maintain the enterprise information model and articulate the data produced and consumed by the enterprise
+ provide architecture guidance on the identification, definition, build-out and operationalization of the strategic data entities
+ ensure that the enterprise follows enterprise data standards and practices, and that they align with the data policy, governance and security requirements
+ ensure consistent data integration and distribution patterns and practices are formulated, socialized and implemented, to ensure integrity of the strategic data across the firm
+ architect, and follow through on the implementation of, the enterprise data lake on aws technologies, to better manage our data lifecycle and enable next-generation data analytics and operational use cases
+ prepare research presentations, whitepapers, proposals and sample applications that demonstrate how technology can affect often complex systems and increase the effectiveness of the firm
+ partner with the channel manager and the business unit leaders and managers to define and review technological strategies
+ provide technical oversight to the
implementation of the technical strategy.

qualifications. required:
+ 10+ years of hands-on technical experience in an architecture role in the financial services industry and/or asset management space
+ experience architecting & managing data integration & data warehouse platforms on cloud solutions such as aws
+ data modeling & solution design experience in oltp & olap environments
+ strong analytics & reporting skills; experience with well-known bi, data blending, data virtualization & reporting tools
+ strong data integration skills and experience, especially around moving large data sets in batch & near real time across the cloud
+ expertise in building internet-scale solutions in the cloud
+ expert-level hands-on experience with traditional rdbms platforms like oracle, db2 & ms-sql
+ experience in migrating traditional rdbms to aws-based database solutions
+ expert-level hands-on sql & procedural sql coding skills; past experience in a dba role helping performance-tune queries and data stores (a brief tuning sketch follows this posting)
+ demonstrated success engaging business partners in a consultative manner and turning business concepts into well-designed technology solutions
+ business-value-focused mindset, balancing tactical and strategic needs
+ proven ability to work well and influence others; strong interpersonal skills, written and verbal communication skills, and the ability to generate great content and present it effectively
+ must have a thorough understanding of agile development methodologies

preferred:
+ advanced degree in technical engineering or a related quantitative field
+ hands-on experience with nosql databases like cassandra & dynamo, and big data solutions like hadoop, hive, pig, etc.
+ additional exposure to machine learning & data science techniques
+ demonstrated successful experience implementing technology solutions resulting in significant impact to the business
+ aws associate solution architect certification
+ previous experience as a data engineer or data scientist
+ previous experience in architecting and developing a commercial product
+ experience with web and enterprise content management tools and web services / soa

t. rowe price is an equal opportunity employer. t. rowe price is an asset management firm focused on delivering global investment management excellence and retirement services that investors can rely on, now and over the long term.
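the query-tuning bullet above is easy to demonstrate end to end: inspect a plan, add an index, and watch the plan switch from a full scan to an index search. a self-contained sketch using sqlite (table and column names invented; the exact plan wording varies by sqlite version):

```python
# Query tuning in miniature: compare the plan before and after adding an index.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE trades (id INTEGER PRIMARY KEY, account TEXT, amount REAL)")
con.executemany("INSERT INTO trades (account, amount) VALUES (?, ?)",
                [(f"acct{i % 100}", float(i)) for i in range(10_000)])

def plan(sql):
    # EXPLAIN QUERY PLAN rows end with a human-readable detail column.
    return [row[3] for row in con.execute("EXPLAIN QUERY PLAN " + sql)]

q = "SELECT SUM(amount) FROM trades WHERE account = 'acct7'"
print(plan(q))   # e.g. ['SCAN trades'] -- a full table scan
con.execute("CREATE INDEX idx_trades_account ON trades(account)")
print(plan(q))   # e.g. ['SEARCH trades USING INDEX idx_trades_account (account=?)']
```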
## analytics solutions architect

join a team recognized for leadership, innovation and diversity. honeywell is transforming from a traditional industrial company to a contemporary digital industrial business, harnessing the power of cloud, big data, analytics, internet of things and design thinking. we are leading change that brings value to our customers, partners and shareholders. we do this through the creation of innovative software and data-driven products and services. honeywell is at the forefront of this change, and we are looking for a team member who thrives on challenging the status quo, welcomes new technology and intelligent risk taking, has real passion for innovation, is results-oriented and excels in a dynamic environment. honeywell's products, services and digital environments collect millions of events per day, and as a data-driven company we believe our data has great stories to tell us. as such, we are investing heavily in our capabilities and establishing a data science & analytics team chartered to build our analytics capabilities to drive business efficiencies & growth opportunities.

as an analytics solution architect / business engagement manager, you would be responsible for engaging business leaders across the supply chain, pricing, finance & other business functions to identify opportunities where we could leverage data & analytics to deliver value and enable their strategies. working closely with the business smes, you will capture critical requirements and the problem statement, and establish project scope and deliverables. using your expertise in data analytics and business consulting, you will convert business problems into analytics requirements for the data science teams. you will also be involved in leading complex analytics initiatives focused on building new predictive analytics capabilities that deliver customer and business value. you will lead cross-functional analytics teams from project inception to analytics delivery. 25% business partnering; 25% analytics solutioning; 25% program management; 25% communicating.

### you must have

+ bachelor's degree in predictive analytics, applied mathematics, applied statistics, computer science, actuarial science, business analytics, business intelligence & analytics, economics, marketing analytics, mathematics, mba with technical undergrad, operations research, quantitative finance or equivalent
+ 3 years of experience in data science, six sigma or a related role delivering advanced analytics solutions, with a background in data mining, statistics or similar technical fields
+ 3 years experience with statistical packages and applications (such as python, r, sas, spss, stata and matlab) and deep knowledge of modeling and business analytics techniques (a small modeling sketch follows this posting)
+ 5 years of experience in technology or management consulting, change management or program management, with demonstrated ability to lead complex engagements, in addition to influencing the delivery of complex, large-scale, business-driven technology solutions to senior executives

### we value

+ experience in data architecture, data management, and data warehouse and/or hadoop environments is highly desired
+ ability to understand business problems and a passion for applying technology solutions and leveraging technology trends to deliver results
+ proven ability to work closely with a multitude of organizations (frequently geographically distributed) to deliver presentations and project tasks
+ expertise in agile methodology
+ strong problem solving / troubleshooting ability
+ experience working with remote and global teams in a matrix organization
+ experience in the manufacturing industry
+ customer focus and process discipline, with excellent collaboration skills
+ results driven, with a positive can-do attitude
+ travel up to 30%
+ database experience (including sql, procedural sql and etl) in relational database environments
+ experience with distributed computing platforms such as hadoop, and associated technologies such as mapreduce and hive, would be advantageous

due to us export control laws, must be a us citizen, permanent resident or have protected status. exempt.
how honeywell is connecting the world

### includes

+ continued professional development
+ some travel required

### additional information

+ **job id:** req135167
+ **category:** business development
+ **location:** 115 tabor road, morris plains, nj 07950 usa

honeywell is an equal opportunity employer. qualified applicants will be considered without regard to age, race, creed, color, national origin, ancestry, marital status, affectional or sexual orientation, gender identity or expression, disability, nationality, sex, religion or veteran status. for more information on applicable equal employment regulations, refer to the eeo is the law poster. please refer to the eeo is the law supplement poster & the pay transparency policy. if a disability prevents you from applying for a job through our website, request assistance; no other requests will be acknowledged. terms & conditions | privacy statement © 2017 honeywell international inc.
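as a small, hedged illustration of the "statistical packages and modeling" requirement above: fitting and scoring a predictive model in python with scikit-learn. the data is synthetic and stands in for whatever supply-chain or pricing signal a real engagement would use:

```python
# Tiny predictive-modeling sketch: fit a classifier and report held-out accuracy.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))                  # synthetic features
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # synthetic target

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)
print("held-out accuracy:", accuracy_score(y_te, model.predict(X_te)))
```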
lead end-to-end solutions delivery of large-scale enterprise hana applications. architect, design and develop large-scale, optimized data warehouses and bi solutions using sap hana. lead data modeling and database design & development using hana modeling techniques. lead reporting and ui design and development. translate complex business requirements into scalable technical solutions meeting data warehousing standards. provide subject matter expertise on hana needs for multiple functional areas. conduct design and code reviews per client standards. benchmark application performance periodically and fix performance issues. articulate the hana cloud business capabilities and value; differentiate hana cloud and on-premise solutions. define methodology, tools and processes to ensure successful customer adoption. position and deliver transition roadmaps between on-prem and cloud solutions. experience in cloud solutions (roadmaps, security, integration). extensibility with sap hana cloud platform (apis, odata, restful). architecting, developing, implementing and installing sap hana cloud. good understanding of network and bandwidth requirements for cloud and on-prem.

required background: strong hands-on experience in data warehouse development in sap hana. expertise in facilitating end reporting solutions based on sap data using sap hana. experienced in core hana data modeling, tuning and reporting. data modeling experience in building logical and physical data models; deploying solutions using a version control management system. hana data modeling expertise a huge plus. as a data scientist, the ability to review and gather business requirements and interpret them into an architectural design and information model in a sap hana environment. experience with sap (native) hana data modeling, schema management, content import & export, and data provisioning technology, including sap lt replication server, data services and other db systems. experience with various sap business objects (bobj) tools. development lifecycle on sap (native) hana from design through implementation and production support. experienced in sap hana information models and hana views (attribute, analytic and calculation views) with graphical and sql scripting, sda (smart data access), sap hana live (analytics). strong experience in performance tuning and query optimization; able to identify bottlenecks in reporting performance. preferred qualifications: prior experience with large data volumes and etl development tools is preferred. experience working with sap hana features and traditional sap bw is a plus.

**business intelligence solutions developer**

**preferred qualifications** the business intelligence solutions developer will be a technology evangelist: customer oriented, results driven, and passionate about providing business intelligence solutions to the supply chain operations organization and beyond. you will have a solid understanding of supply chain, coupled with strong technical and analytical skills. you must be a self-starter and thrive in a fast-paced environment, with the desire to solve real business problems with little guidance. you will partner closely with business leaders to develop a deep understanding of their business needs and create innovative, insightful and user-friendly tools or reports that enable them to manage their businesses more effectively. you will be responsible for designing and implementing solutions using both third-party and in-house reporting tools, modeling metadata, building reports and dashboards, and administering the platform software, in order to create a culture where data and analytics are the foundation for a solid decision-making process.

**responsibilities:**
* provide and manage ad hoc reporting, standard dashboards, repositories and analysis to support high-priority initiatives across the organization
* drive significant advanced analytics initiatives end to end, across multiple products and multiple layers of architecture
* participate in analysis of production incidents and complex customer issues as needed
* be well versed in modern software development best practices, including continuous integration approaches and agile development methodologies
* collaborate and communicate effectively with cross-functional team members and management to drive bi initiatives across the organization

**desired skills and experience**
* 5 years of supply chain operations
* strong technical accomplishments in sql, etl and data analysis skills
* write and debug complex sql queries
* system administration skills
* strong data warehouse and data mart concepts
* experience with unix/linux
* deep understanding of data architecture and data management principles
* languages: sql, java, python, c, r
* dw/etl: oracle databases (dba if possible)
* code repository: familiar with git, cvs
* reporting platforms: obiee (admin if possible), dv, tableau, apex
* other: jira, confluence, rest web services

**preferred qualifications**
* experience with data visualizer
* the ability to communicate complex information clearly and concisely through data analysis
* highly organized and able to prioritize and work to tight deadlines
* able to build positive business relationships and work collaboratively across organizational boundaries
* excellent verbal and written communications skills
* proven performance working independently
* experience in agile/scrum methodology a plus
* knowledge of data modeling tools (erwin, embarcadero) is a plus
* data science: forecasting and analysis methodologies

**detailed description and job requirements** design, develop, troubleshoot and debug software programs for databases, applications, tools, networks, etc. as a member of the software engineering division, you will take an active role in the definition and evolution of standard practices and procedures. you will be responsible for defining and developing software for tasks associated with the developing, designing and
debugging of software applications or operating systems. work is non-routine and very complex, involving the application of advanced technical/business skills in an area of specialization. leading contributor, individually and as a team member, providing direction and mentoring to others. bs or ms degree or equivalent experience relevant to the functional area. 7 years of software engineering or related experience.

**oracle is an equal employment opportunity employer. all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans status, or any other characteristic protected by law.**

**job:** all roles **location:** us-ca, california-santa clara **job type:** regular employee hire **organization:** oracle

if you are an experienced software engineer with a passion for designing and delivering big data solutions using cutting-edge technologies, want to be a part of an exciting data simplification journey, and are looking for a collaborative team environment with a wealth of opportunities to innovate and the intellectual curiosity to learn, a career in customer data technologies in pi may be right for you! at fidelity, we love to use data and analytics to personalize incredible customer experiences and develop solutions that help our customers live the lives they want. as part of our digital transformation, we have made significant investments in innovative big data capabilities to support our customer-obsessed culture and growing data science practice. we are looking for a hands-on data technologist who can help us build our next-generation, cloud-enabled data ecosystem. the customer data technology group within fidelity's personal investing (pi) organization is seeking a big data engineer to be part of an exciting and fast-paced team focused on designing and implementing large-scale distributed data processing systems using cutting-edge, cloud-based technologies. in this role you will implement a variety of data integration frameworks to ingest data into a data lake that enables our data scientists to explore and derive insights quickly. this position is a critical element to delivering fidelity's promise of creating the best customer experiences in financial services.

*the expertise we're looking for*
* bachelor's or master's degree in a technology-related field (e.g. engineering, computer science, etc.) required
* 3 years extensive experience in object-oriented programming (java, scala, python)
* 3 years of hands-on experience in implementing data integration frameworks to ingest terabytes of data in batch and real-time into an analytical environment
* 2 years of big data processing technologies such as spark, hadoop, etc.
* 1 year of experience in developing big data applications in the cloud (preferably aws) is highly desirable
* deep knowledge of database technologies, such as relational and nosql
* experience with devops, continuous integration and continuous delivery (maven, jenkins, stash, ansible, docker)
* strong knowledge of developing highly scalable distributed systems using open source technologies
* solid experience in agile methodologies (kanban and scrum)

*the skills you bring*
* ability to think out of the box and design end-to-end solutions
* passion and intellectual curiosity to learn new technologies and business areas
* ability to deal with ambiguity and work in a fast-paced environment
* excellent communication skills, both through written and verbal channels
* excellent collaboration skills to work with multiple teams in
the organization
* ability to understand and adapt to changing business priorities and technology advancements
* strong knowledge of technology trends in implementing the big data ecosystem
* solid understanding of data architecture patterns such as lambda, kappa, event-driven architecture, data as a service, microservices, etc.

*the value you deliver*
* designing, building and supporting mission-critical applications to provide the best customer experience
* exploring new technology trends and leveraging them to simplify our data ecosystem
* driving innovation and leading the team to implement solutions with future thinking
* collaborating with internal and external teams to deliver technology solutions for the business needs
* guiding teams to improve development agility and productivity
* resolving technical roadblocks for the team and mitigating potential risks
* delivering system automation by setting up continuous integration / continuous delivery pipelines
* acting as a technical mentor to the team, bringing them up to speed on the latest data technologies and promoting continuous learning

*how your work impacts the organization* customer data technology in pi supports the platforms that enable business users to collect and analyze the customer data needed to provide the best customer experience. company overview: at fidelity, we are focused on making our financial expertise broadly accessible and effective in helping people live the lives they want. we are a privately held company that places a high degree of value in creating and nurturing a work environment that attracts the best talent and reflects our commitment to our associates. for information about working at fidelity, visit *fidelitycareers.com*. fidelity investments is an equal opportunity employer. **job:** *database* **title:** *big data engineer* **location:** *nc-durham* **requisition id:** *1709221*

location: fremont, ca / san francisco, ca. duration: 6 months plus. a chief-architect-level person with hands-on coding experience is required: python, knowledge of lambda functions, aws architecture, json, xml, exposure to apache flask (a minimal handler sketch follows below), understanding of deploying data science models on cloud, ability to lead and work with an offshore team of 5-6 members, and good communication skills. thanks & regards, shrijeet nair, etouch systems, 6627 dumbarton cir, fremont, ca 94555. direct line: 510-795-4800 #173. fax: (510) 795-4803. e-mail: snair@etouch.net url: http://www.etouch.net
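the requirement above pairs python lambda functions on aws with json handling; a minimal aws lambda handler sketch (the event shape and response format assume an api gateway proxy integration, and the greeting logic is invented):

```python
# Minimal AWS Lambda handler sketch: parse a JSON body and return a JSON reply.
import json

def handler(event, context):
    """Entry point configured in the Lambda console, e.g. module.handler."""
    try:
        body = json.loads(event.get("body") or "{}")  # API Gateway proxy event
    except json.JSONDecodeError:
        return {"statusCode": 400, "body": json.dumps({"error": "invalid json"})}
    name = body.get("name", "world")
    return {"statusCode": 200, "body": json.dumps({"message": f"hello, {name}"})}

# Local smoke test (no AWS needed):
if __name__ == "__main__":
    print(handler({"body": json.dumps({"name": "etouch"})}, None))
```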
team daugherty is in search of a data architect to join our growing team in minneapolis. in this role you will create business intelligence and information management solutions for clients in the metro area. you will have the opportunity to work with analysts, technical team leads and software engineers on business-intelligence-related architectural issues and best practices. as a data architect, you will be responsible for the following:
* creating and designing information management solutions
* managing other business intelligence analysts and developers throughout the project lifecycle
* assisting in the design and development of dashboards, reports and other business intelligence metrics
* acting as lead and subject matter expert for clients, including training, design sessions, workshops and project meetings
* collaborating with clients to design information management and business intelligence solutions
* applying best business intelligence practices, including data manipulation and data visualization
* identifying and documenting potential data sources and flows
* analyzing existing enterprise data warehouse structures to determine relevance to business needs

we are looking for someone with:
* a 4-year college degree (preferably in computer science or business)
* 10+ years of experience in information management, data analysis and business intelligence
* strong sql skills
* demonstrated data analysis skills
* expertise in two integration tools (dts, ssis, informatica, data stage, etc.)
* proven experience in data modeling/architecture (operational, data warehouse and data mart design)
* knowledge of case tools (erwin, oracle designer 2000, er studio, etc.)
* experience in olap tools (analysis services, cognos, business objects, etc.)
* experience using reporting tools (crystal reports, proclarity, etc.)

we offer members of team daugherty:
* excellent health, dental and vision insurance
* revenue sharing and a 401(k) retirement savings plan
* life, disability and long-term care insurance
* little to no travel
* robust career development and training

our company challenges: empowering clients with highly rewarding data discovery and licensing tools; ingesting and managing billions of healthcare records from a wide variety of partners; standardizing on common data models across data types; orchestrating an industry-leading hipaa privacy layer; innovating our proprietary de-identification and data science algorithms; building a culture that supports rapid iteration and new possibilities. the infrastructure and culture we are building will provide an environment that cultivates innovation. we want to move fast, knowing we can fix anything we break along the way. if a new need arises, we want to turn around a solution quickly. we want to solve our challenges in ways that create even more possibilities. we're creating a platform that lets us discover what else we might do.

how you will help: you'll be embarking on a hands-on role, from ingestion up to making all data available in the warehouse layer, in standardized form and ready to use. you'll map the data, build the models, assess data quality and qa the transformed results. you'll reinforce data architecture standards and data warehouse policies and procedures, ensuring that clients receive the highest quality data. you'll use the best tools for the job, whether modern and revolutionary or time-tested and proven, to deliver elegant, scalable solutions that meet business and technical needs. your team will support you, and you the same; peer review of solutions and implementations is expected. you will play an integral part in building the foundation of everything to come.

what you will do: map de-identified healthcare data to proprietary data structures. perform troubleshooting on all etl processes and resolve issues effectively. develop company standards in terms of nomenclature, storage, design and deployments. provide support to all data warehouse initiatives. develop and improve qa test plans for incoming data. develop qc routines for data processing and ingested data. work with a qa automation framework that provides real-time alerts on data ingestion, drop rate and data processing. implement systems for tracking data quality and consistency.

about you: you listen carefully, absorb information well and take the initiative to implement improvements. you have worked extensively with ehr, medical claims, pharmacy and/or lab data, and are fluent in healthcare it standards (a toy parsing sketch follows this posting). sql: you know it, write smart queries, it's no big deal. you are data driven, testing and measuring as much as you can. good understanding of hipaa rules and requirements.

desired skills and experience: 5+ years of hands-on sql. 5+ years of experience with either ehr, medical claims, pharmacy and/or lab data. understanding of healthcare it standards and data structures such as ansi 837, 835, ncpdp d.0, hl7 and optionally hl7 ccr, ccd or fhir. 3+ years of experience in data warehousing. hive, spark sql. we have big plans: we are building a platform that will scale to support an ever-growing array of data providers and innovative products. you must be able to think big while still delivering on near-term requirements.

about healthverity: healthverity, headquartered in philadelphia, is a venture-backed startup leveraging state-of-the-art technologies to empower healthcare organizations to discover, license and link traditional and emerging data for advanced analytics. by applying modern solutions to longstanding problems, we see a significant opportunity to improve the way our clients engage with patients. this is a unique opportunity to join an experienced and entrepreneurial management team to address a large unmet need and build the industry's most flexible and scalable cloud platform for large-scale healthcare data. healthverity is an equal opportunity employer.
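hl7 v2, one of the standards listed above, is a pipe-delimited text format, so a toy parser is short. this sketch uses a synthetic message and is illustration only; production hl7 handling needs a dedicated library and much more care:

```python
# Toy HL7 v2 parser: split a message into segments and fields by delimiter.
SAMPLE = "\r".join([
    "MSH|^~\\&|LAB|HOSP|EHR|CLINIC|201804300830||ORU^R01|123|P|2.3",
    "PID|1||999999^^^HOSP||DOE^JANE",
    "OBX|1|NM|GLU^glucose||98|mg/dL",
])

def parse(message: str) -> dict:
    """Group segments by their three-letter name; fields are pipe-separated."""
    segments = {}
    for line in message.split("\r"):
        fields = line.split("|")
        segments.setdefault(fields[0], []).append(fields[1:])
    return segments

msg = parse(SAMPLE)
print(msg["PID"][0][4])   # DOE^JANE  (patient name field)
print(msg["OBX"][0][4])   # 98        (observation value)
```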
we are seeking data stewards for a long-term (10 month) engagement with our austin, texas based client. the data steward performs senior-level business process analysis, data analysis and assistance on system-wide data initiatives. the data steward's responsibilities may include:
• establishing and maintaining a robust outreach program to hhs programs and business process experts
• socializing concepts and best practices, and encouraging a culture that fosters each individual's roles and responsibilities with respect to data quality
• working with it project management staff to ensure enterprise information management (eim) requirements are implemented in new or enhanced systems
• analyzing business processes to determine how data is generated, used or exchanged
• identifying opportunities to reuse data or business processes to achieve hhs program objectives, and ultimately facilitating data-driven decision making through advancing the consistency and quality of hhs data
• other duties as assigned

skills/experience (years required):
8 - experience performing data and process analysis to identify and assess errors and anomalies and determine root cause; works with data owners to correct data generation, collection and business processes
8 - previous experience analyzing and standardizing data in large eim and master data management projects and teams
6 - previous experience documenting data processes, business and technical metadata, and preparing reports
4 - experience identifying, analyzing and resolving complex data issues across multiple environments
4 - experience with data and dimensional modeling tools
4 - experience with informatica or other etl tools to perform data reconciliation, data standardization, data cleansing, risk, data warehousing and data mapping analysis for data translation activities
strong communication, analytical and interpersonal skills at all levels. strong ability to work on multiple projects or project assignments.

preferred (years):
2 - previous health and human services industry experience. experience working with texas medicaid management information system (mmis) data and texas integrated eligibility redesign system (tiers) a plus. bachelor's degree in computer science, systems engineering or equivalent experience.

our client, headquartered in fort washington, pa, is hiring a data architect to join their team. this is a full-time, direct-hire opportunity working on-site in their headquarters.
this role will be responsible for data architecture, including mapping systems and interfaces used to manage data, reviewing and setting standards for data management, analyzing the current state and designing the desired future state. this role will have overall responsibility for managing data across different analytical platforms and use cases within various company business units and function-related activities.

key responsibilities: set vision and strategy for the leveraged and harmonized use of data to improve system stability, reduce system handoffs and error, enable advanced analytics, and empower marketing to mine customer data and define use cases that would accelerate revenue growth and create competitive advantage. conceptualize and influence data design with a core understanding of data storage and retrieval patterns. establish and drive architecture delivery, monitoring and value measurement for strategic bi and analytics initiatives. adopt and drive new technology areas, including columnar and nosql databases, predictive analytics, data visualization and unstructured data, to guide the organization in understanding and adopting them. drive innovation and maintain influential knowledge of industry trends in strategic areas of importance such as machine learning, ai and iot. continuously provide detailed, accurate and timely updates on current activities to all stakeholders, including finance and ecommerce. should be passionate about identifying and solving problems for customers, and uncover customer needs through direct interaction as well as quantitative or qualitative research. create data monitoring models and develop database design and architecture documentation for the management and executive teams. consult with solution project teams to develop a solution design that is compliant with the architecture and standards of the company. establish and maintain the architecture and standard technology products to enable consistent, reusable and efficient practices within the bi/analytics function. design application software components, including specifications of audit controls, exception and error handling, security, retention, and procedural or recovery logic, to construct the application. develop application-specific standards and procedures, such as how errors will be handled in the application, to complete detailed design.

education requirements: bachelor's degree required; management information systems, computer science or a related field preferred.

related work experience: 7+ years of experience in enterprise data warehouse / data integrations. 5+ years hands-on experience with modeling tools such as (but not limited to) erwin. experience working with business teams to understand requirements and convert them into logical and physical data models. working knowledge of application architecture patterns, nosql database infrastructure, and robust asset and data migration tools. large-scale solution delivery involving a diverse stack of technologies. comfortable and confident when speaking with clients as a technical expert, and able to narrate data-driven insights and translate technical concepts into simple terminology for business clients of various levels. passionate about identifying and solving problems for customers, with the ability to uncover business needs through direct interaction as well as quantitative or qualitative research to define compelling solutions. excellent understanding of the agile/devops operating model; experience and passion to work in a fast-paced agile environment, delivering functional features in small time durations while utilizing automation toolsets. team player with
the ability to multi-task in a fast-paced dynamic environmentproven ability to manage complex processes and drive continuous process improvementfoster a metrics-driven culture to drive accountability and transparencyself-starter self-confident and assured in personal abilitiesadditional skills (preferred)technology certifications (e g sap salesforce amazon etc…)deep understanding of:cloud and aws offerings supported by large implementation experiencesdata concepts (e g etl near- real-time streaming data structures metadata and workflow management)data integration toolsbig data (e g hadoop flume hbase hive map-reduce oozie sqoop spark athena)devops tools (e g bamboo bitbucket puppet jenkins chef docker etc…)proven experience in interfacing highly complex mobile & web solutions with apis and enterprise platforms services consulting experience rue21 is looking for an experienced data architect we are looking for an individual who will take ownership of architectural oversight and guidance for the data warehouse and business intelligence platforms at rue21 the right candidate should be a self-motivated highly detail-oriented team-player with a positive drive to strategize and implement analytical platforms this role will also drive ongoing analysis and design to all supporting systems that will form the backbone rue21’s business intelligence and analytical systems own the architecture design and development of reporting and analytical solutions using the best tools suited for rue21’s needdevelop and maintain the organization's strategy for managing information ensuring that uniformly recognized and accepted data definitions are developed and applied throughout the organization ensure proper documentation in developed to support rue21’s enterprise reporting systems understand and evaluate business requirements and translate into specific analytical solutions provides technical coordination of project teams and direct administrative supervision identify and resolve production and application issues 5+ years of it experience with several years in hands on data architecture modeling and strategy; with majority of it earned in building enterprise level platformsat least 2 years in developing big data solutions and architecturestrong metadata modeling experienceexperience working with data visualization tools (i e tableau microstrategy domo etc )a strong analytical and quantitative analysis mindsetability to think strategically and translate plans into phased actions in a fast paced high pressure environmentdimensional and or multidimensional modeling experience is a pluse-commerce or retail experience is a plusstrong business acumen and superior written verbal and presentation skillsbs ms degree in information technology or computer science or equivalent related degree 2018 - 006 data architectabout rigilrigil is an award-winning woman-owned 8(a) small disadvantaged business that specializes in technology consulting strategy consulting and product development we value teamwork and strive to build strong leaders job typefull timelocation(s)oklahoma cityduties and responsibilitiesdesign and build highly available and high-volume relational databaseswork with customers in order to analyze business requirements with implementation of those requirements into a data model or formulation of design enhancements to the existing data model to fulfil those requirements work with software architects on overall system design and architecture collaborate with data management team for enterprsie data requirements 
- Analyze, develop, and implement solutions that meet the company's capacity and performance requirements.
- Analyze and tune databases for optimal efficiency.
- Assist with backup, clustering, mirroring, replication, and failover activities.
- Support project teams with data loads to dev and test environments.
- Develop exchange data flows, functional documentation, and logical and data models; ensure customers, end users, developers, and other stakeholders have a common understanding of the models.
- Work with project teams to develop and document information requirements and business rules in the form of conceptual, logical, and physical data models and data flow diagrams.
Required skills: a qualified candidate will possess the following:
- Works with other solution architects, DBAs, and the application tech team to deploy high-level design documents and required enterprise permits.
- Experience with data modeling, data mapping, data profiling, and data quality.
- A strong understanding of DB technology features and their effective deployment in production systems (Oracle or SQL Server).
- Strong SQL skills.
- Database/data modeling knowledge, with experience in physical and relational database design.
- The ability to develop and maintain coherent and flexible database designs, facilitating and enabling the integration of multiple projects into a standard data model.
- Experience designing enterprise-scale database models on Oracle or SQL Server.
- Strong analytical skills, along with a proven understanding of big data and of high-redundancy, high-availability NoSQL and SQL databases.
- Effective oral and written communication skills and excellent interpersonal skills; must be able to communicate and interact well with coworkers.
Education requirements: BS in computer science or a similar technical field.
Required experience: a qualified candidate will have a minimum of 14 years of experience in database architecture and modeling.
Application instructions: to be considered for this position, please apply at www.rigil.com/careers. Rigil is an equal opportunity employer. Rigil considers applicants for all positions without regard to race, color, religion, sex, national origin, age, marital status, sexual preference, personal appearance, family responsibility, the presence of a non-job-related medical condition or physical disability, matriculation, political affiliation, veteran status, or any other legally protected status. Rigil requires a pre-employment background investigation.

Position summary: we are currently seeking a motivated, career- and customer-oriented big data engineer to join our team in Alexandria, VA to begin an exciting and challenging career with Unisys Federal Systems. The ideal candidate should have an eye for building and optimizing data systems and will architect and build the software, research and implement the base system, and help chart our course forward as we continue to scale and solve interesting challenges.
Responsibilities:
- Work closely with other big data reporting and analytics team members to optimize the agency's data systems and pipeline architecture.
- Design and build the infrastructure for data extraction, preparation, and loading of data from a variety of relational and non-relational sources into the big data environment and AWS (see the sketch after this posting).
- Build data and analytics tools that will offer deeper insight into the pipeline, allowing for critical discoveries surrounding key performance indicators.
Required skills:
- Minimum 15 years of IT experience and a master's degree in computer science, information systems, or an equivalent quantitative field, or an equivalent blend of education and experience.
- 3+ years of experience in a similar data engineer role.
- Experience working with large data sets and extracting from structured or unstructured datasets.
- Demonstrated ability to build processes that support data transformation, data structures, metadata, dependency, and workload management.
- Strong interpersonal skills and the ability to project-manage and work with cross-functional teams.
- Advanced working SQL knowledge and experience working with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases.
- Experience building and optimizing 'big data' data pipelines, architectures, and data sets.
- Experience with displaying data geospatially.
- Experience with the following tools and technologies: Cloudera Hadoop, Spark, Kafka, NiFi, Elasticsearch, Hive, Solr; relational SQL and NoSQL databases; data visualization tools such as D3.js and Leaflet; AWS cloud services such as EC2, EMR, RDS, and Redshift; stream-processing systems such as Storm and Spark Streaming; programming languages and related web technologies such as Python, Java, JavaScript, C++, JSON, R, etc.
Do you have what it takes to be mission critical? Your skills and experience could be mission critical for our Unisys team supporting the federal government in their mission to protect and defend our nation and transform the way government agencies manage information and improve responsiveness to their customers. As a member of our diverse team, you'll gain valuable career-enhancing experience as we support the design, development, testing, implementation, training, and maintenance of our federal government's critical systems. Apply today to become mission critical and help our nation meet the growing need for IT security, improved infrastructure, big data, and advanced analytics. Unisys is a global information technology company that solves complex IT challenges at the intersection of modern and mission critical. We work with many of the world's largest companies and government organizations to secure and keep their mission-critical operations running at peak performance; streamline and transform their data centers; enhance support to their end users and constituents; and modernize their enterprise applications. We do this while protecting and building on their legacy IT investments. Our offerings include outsourcing and managed services, systems integration and consulting services, high-end server technology, cybersecurity and cloud management software, and maintenance and support services. Unisys has more than 23,000 employees serving clients around the world. Unisys offers a very competitive benefits package, including health insurance coverage from the first day of employment, a 401k with an immediately vested company match, and vacation and educational benefits. To learn more about Unisys, visit us at www.unisys.com. Unisys is an equal opportunity employer (EOE): minorities, females, disabled persons, and veterans. #fed#
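As a rough illustration of the extract-prepare-load work described in the posting above, here is a minimal PySpark sketch. It assumes a Spark cluster with S3 access and a reachable JDBC source; every URL, credential, table name, and path is a hypothetical placeholder, not anything from the posting itself.

```python
# Minimal sketch of an extract-prepare-load job: pull from a relational
# source over JDBC and a non-relational source (JSON on S3), standardize,
# and land the result as Parquet. All endpoints and paths are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

# Relational source: a JDBC extract (the driver jar must be on the classpath).
orders = (spark.read.format("jdbc")
          .option("url", "jdbc:postgresql://db.example.gov:5432/ops")  # hypothetical
          .option("dbtable", "public.orders")
          .option("user", "etl_user").option("password", "***")
          .load())

# Non-relational source: semi-structured JSON files already staged on S3.
events = spark.read.json("s3a://example-staging/events/")  # hypothetical bucket

# Preparation: normalize types, drop obviously bad rows, join on a shared key.
prepared = (orders
            .withColumn("order_ts", F.to_timestamp("order_ts"))
            .dropna(subset=["order_id"])
            .join(events.select("order_id", "channel"), "order_id", "left"))

# Load: partitioned Parquet in the analytics zone for downstream query engines.
prepared.write.mode("overwrite").partitionBy("channel") \
        .parquet("s3a://example-analytics/orders/")
```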
At Globant we dream and build digital journeys that matter to millions of users. We do that by leveraging engineering, design, and innovation, with our own industry-leading practices like our agile pods and specialized studios. We want you to join us in creating these journeys for the biggest clients in tech, retail, travel, banking, ecommerce, and media, revolutionizing and growing their core businesses while helping them (and you!) stay ahead of the curve.
What are we looking for? We are looking for a business intelligence and data engineer who will be part of the team providing critical business insight, planning, and advisory services for our client. You will join a data team whose mission is to create an innovative metrics monitor for the client, an initiative to take advantage of the ever-growing computing capacity of the company for financial and logistics health checks on a massive operation. You will partner with software engineers, program managers, and program stakeholders to understand their needs and collaborate in the data manipulation and building of this novel product. To be successful you will be an organized self-starter with a strong client-service orientation, a passion for continual improvement and innovation, and strong technical skills that enable you to create scale.
Responsibilities:
- Maintain and develop data metric definitions and reports for internal and external clients.
- Build dashboards using common data visualization technologies to provide a health monitor with real-time data on financial and operational information (a small sketch of this kind of metrics feed follows this posting).
- Build underlying data pipelines to manipulate data for reporting purposes.
- Present insights and findings in front of stakeholders and leaders of the organization.
- Provide reports for weekly status meetings with high visibility to C-level executives.
Minimum qualifications:
- BA/BS degree with an emphasis on work of a quantitative nature (statistics, computer science, engineering, mathematics).
- 5+ years of proven experience working on data manipulation and/or building dashboards.
- Strong SQL skills and BI knowledge of data analysis.
- Demonstrated ability to understand new datasets and data structures.
Preferred qualifications:
- Experience analyzing large datasets, with strong statistical, quantitative modeling, and forecasting skills.
- Proficiency in at least one scripting language for automation purposes (Python, Bash, among others).
We are interested in hard-working, fast-learning talents, and we have the know-how and scale to help you make your own career path. If you seek an entrepreneurial, flexible, and team-oriented culture, come join us. We are ready.
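A minimal sketch of the kind of daily metrics feed a dashboard might poll, as described above. SQLite stands in for the client warehouse, and the table and column names (orders: order_date, amount, status) are hypothetical.

```python
# Compute a small set of daily health metrics and write them to a table that
# a dashboard tool (Tableau, etc.) could read. The schema is a placeholder.
import sqlite3

conn = sqlite3.connect("warehouse.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS metrics_daily (
    metric_date TEXT, revenue REAL, order_count INTEGER, error_rate REAL
);
DELETE FROM metrics_daily;
""")
conn.execute("""
INSERT INTO metrics_daily
SELECT order_date,
       SUM(amount)                                         AS revenue,
       COUNT(*)                                            AS order_count,
       AVG(CASE WHEN status = 'error' THEN 1.0 ELSE 0 END) AS error_rate
FROM orders
GROUP BY order_date
""")
conn.commit()
```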
Company description: Ark Solutions Inc. is a privately held, illustrious SAP partner committed to providing your business with tangible solutions that improve your company's performance and profitability. We recognize dynamic business changes and understand rapidly evolving technologies, delivering winning business outcomes through our deep industry experience of transforming business through technology.
Job description:
Job title: Data Steward. Location: Austin, TX. Duration: 12+ months.
• The data steward position is an information technology (IT) position and will be part of the enterprise data governance (EDG) team, working in collaboration with the Center for Analytics and Decision Support (CADS) program within the Health and Human Services system's business domain. The mission of EDG is to ensure HHS system data is visible, accessible where appropriate, understandable in context, trusted by users, and uniformly governed, in order to advance data quality and data analytics. This enables data-driven decision making across the HHS system.
• The data steward performs senior-level business process analysis, data analysis, and assistance for HHS programs on system-wide data initiatives for the Center for Analytics and Decision Support's enterprise data governance initiative.
The data steward's responsibilities may include:
• Establishing and maintaining a robust outreach program to HHS programs and business process experts.
• Socializing concepts and best practices, and encouraging a culture that fosters each individual's roles and responsibilities in the organization with respect to data quality.
• Working with IT project management staff to ensure enterprise information management (EIM) requirements are implemented in new or enhanced systems.
• Analyzing business processes to determine how data is generated, used, or exchanged.
• Identifying opportunities to reuse data or business processes to achieve HHS program objectives and ultimately facilitate data-driven decision making by advancing the consistency and quality of HHS data.
• Other duties as assigned.
Minimum requirements (years / skills and experience):
- 8: experience performing data and process analysis to identify and assess errors and anomalies and determine root cause; works with data owners to correct data generation, collection, and business processes (a small profiling sketch follows this posting).
- 8: previous experience analyzing and standardizing data in large EIM and master data management projects and teams.
- 6: previous experience documenting data processes and business and technical metadata, and preparing reports.
- 4: experience identifying, analyzing, and resolving complex data issues across multiple environments.
- 4: experience with data and dimensional modeling tools.
- 4: experience with Informatica or other ETL tools performing data reconciliation, data standardization, data cleansing, risk data warehousing, and data mapping analysis for data translation activities.
- Strong communication, analytical, and interpersonal skills at all levels.
- Strong ability to work on multiple projects or project assignments.
Preferred (years / skills and experience):
- 2: previous health and human services industry experience.
- Experience working with Texas Medicaid Management Information System (MMIS) data and the Texas Integrated Eligibility Redesign System (TIERS) a plus.
- Bachelor's degree in computer science, systems engineering, or equivalent experience.
Additional information: all your information will be kept confidential according to EEO guidelines.
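A minimal sketch of the kind of error-and-anomaly profiling a data steward might run before engaging data owners, assuming pandas; the file name, key column, and domain rules are hypothetical.

```python
# Profile an extract for common quality problems: nulls, duplicate keys, and
# out-of-range values. The column rules stand in for a real data dictionary.
import pandas as pd

df = pd.read_csv("client_records.csv")  # hypothetical extract

report = {
    # Null rate per column, sorted worst-first.
    "null_rates": df.isna().mean().sort_values(ascending=False),
    # Duplicate natural keys usually signal a broken generation process.
    "duplicate_ids": int(df.duplicated(subset=["client_id"]).sum()),
    # A simple domain check: birth dates cannot be in the future.
    "bad_birth_dates": int((pd.to_datetime(df["birth_date"], errors="coerce")
                            > pd.Timestamp.now()).sum()),
}

for name, value in report.items():
    print(f"--- {name} ---\n{value}\n")
```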
Job number: R0021934. Booz Allen Hamilton has been at the forefront of strategy and technology for more than 100 years. Today the firm provides management and technology consulting and engineering services to leading Fortune 500 corporations, governments, and not-for-profits across the globe. Booz Allen partners with public and private sector clients to solve their most difficult challenges through a combination of consulting, analytics, mission operations, technology, systems delivery, cybersecurity, engineering, and innovation expertise.
Data Architect, Senior
Key role: integrate, manipulate, and manage vast amounts of data in the next generation of big data analytic solutions for clients. Combine engineering expertise with innovation to deliver robust solutions that serve our clients and stand apart from our competitors. Interact with a multi-disciplinary team of analysts, data scientists, developers, and users to comprehend the data requirements and develop a robust data processing pipeline that will ingest, manipulate, normalize, and expose potentially billions of records per day to support advanced analytics. Leverage expertise in programming, including clean coding habits, attention to detail, and a focus on quality, and collaborate with and contribute to open source software communities by ensuring quality delivery of software through thorough testing and reviews. Architect, build, and launch new data models that provide intuitive analytics to customers. Design, build, and launch extremely efficient and reliable data pipelines to move both large and small amounts of data to the data platform. Design and develop new systems and tools to enable people to consume and comprehend data faster, and identify new technologies to be injected into the platform to support advanced data integration and analysis (a small service-API sketch follows this posting).
Basic qualifications:
- 5+ years of experience with dimensional data modeling and schema design in data warehouses.
- 3+ years of experience with hands-on software design, implementation, and testing.
- 2+ years of experience with custom or structured ETL design, implementation, and maintenance.
- Experience working with either MapReduce or an equivalent system at any size or scale, and with batch and streaming frameworks, including Storm, NiFi, Apex, or Flink.
- Experience with storage components, including Accumulo, HBase, or Hive, and search technologies, including Solr and Elasticsearch.
- Experience developing service APIs for external consumption.
- Knowledge of RESTful services design, development, and testing.
- Active Secret clearance required.
- BS degree in CS.
Additional qualifications:
- Experience with multiple data modeling concepts, including XML and JSON.
- Experience with RDBMS data stores, including Oracle and MySQL.
- Experience with machine learning or deep learning concepts and algorithms.
- Experience with DevOps methods and tools, including Jenkins, Git, SVN, Docker, or Vagrant.
- Experience with large-scale distributed systems design and development, including scaling, performance, and scheduling.
- Knowledge of at least one scripting language, including Python, Node, Ruby, or Bash.
- Knowledge of system architecture, including process, memory, storage, and networking management, preferred.
- Ability to exhibit clean coding habits, attention to detail, and a focus on quality.
- Ability to learn technical concepts quickly and communicate with multiple functional groups.
- Possession of excellent analytical and problem-solving skills.
- Ability to obtain Security+ or CISSP certification.
Clearance: applicants selected will be subject to a security investigation and may need to meet eligibility requirements for access to classified information; Secret clearance is required. Integrating a full range of consulting capabilities, Booz Allen is the one firm that helps clients solve their toughest problems, by their side, to help them achieve their missions. Booz Allen is committed to delivering results that endure. We are proud of our diverse environment. EOE M/F/Disability/Vet. SIG2017. Date posted: 20010101. Department: design engineering. - provided by Dice
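A minimal sketch of a service API for external data consumption of the sort the Booz Allen role mentions. It assumes Flask, and an in-memory dict stands in for the real storage layer (HBase, Hive, or Elasticsearch in the posting's stack); the endpoint shape and record fields are hypothetical.

```python
# A tiny read-only REST endpoint exposing records from a data platform.
# The in-memory "store" is a placeholder for a real backing store.
from flask import Flask, jsonify, request, abort

app = Flask(__name__)

store = {  # hypothetical sample records
    "r1": {"id": "r1", "source": "sensor-a", "value": 41.2},
    "r2": {"id": "r2", "source": "sensor-b", "value": 17.9},
}

@app.route("/records/<record_id>")
def get_record(record_id):
    record = store.get(record_id)
    if record is None:
        abort(404)  # unknown id
    return jsonify(record)

@app.route("/records")
def list_records():
    # Optional filter by source, e.g. /records?source=sensor-a
    source = request.args.get("source")
    rows = [r for r in store.values() if source in (None, r["source"])]
    return jsonify(rows)

if __name__ == "__main__":
    app.run(port=8080)
```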
TMCI is currently seeking a motivated team player with Excel, VB, SQL, and data manipulation, modeling, and programming skills to join our team. This full-time position with benefits is located in the Washington, DC metro area (Northern Virginia).
Description: in this role you will be responsible for maintaining and updating a complex set of multi-dimensional models and data analysis and manipulation tools used to support a major TMCI client. Your activities will include:
· Becoming the go-to team member and chief toolsmith for our automated models.
· Interacting with the subject matter experts on the team to document technical requirements, user stories, and design specifications for tool modification and enhancement efforts.
· Coding, testing, debugging, and documenting in accordance with TMCI standards.
· Implementing new and enhanced solutions designed to access and manipulate client data efficiently.
· Creating data structures and index strategies to support independent client use of data analysis and visualization tools such as Tableau, applied against data sets created and extracted from within our models and tools.
· Creating extraction/transformation/load tools as required to support client use of our datasets.
You will also work with our data scientists, data analysts, data subject matter experts, and corporate leadership to identify new requirements and build new toolsets.
Qualifications:
· US citizenship is required.
· Proven experience in logical database concepts, SQL query structure, and systems analysis and design, and total comfort as a hands-on programmer.
· Relevant experience with complex Excel, embedded SQL, multidimensional models, and data design and manipulation is required.
· Professional experience with SAS or equivalent is a significant plus.
· Professional experience with Perl is a significant plus.
· Experience with Tableau is a significant plus.
· Prior experience with DoD, VA, or commercial health, clinical, and/or medical information systems is a plus.
· Bachelor's degree is desired; relevant technical certifications are a plus.
This is an excellent opportunity for an individual with three or more years of relevant experience ready to confidently take the next step in their career. We also welcome interest from mature professionals with a desire to be a hands-on participant as well as a mentor to the team in the data programming and manipulation area. For more information about our company, please visit www.tmci-international.com.
Our financial client is looking for a data model architect to join their rapidly growing company. The data model architect is responsible for developing conceptual, logical, and physical data models for data marts. This role will establish data modeling best practices and provide data leadership to the IT and business intelligence teams. The data model architect will support new and existing system development and maintenance project efforts, providing architectural guidance for all analytics solutions.
Requirements:
- Years of experience: 7-10+.
- Experience with data modeling, including reverse and forward engineering.
- Identifies database requirements by working closely with the business intelligence, IT, and operations departments; analyzing department applications, programming, and operations; evaluating existing systems; and designing proposed systems.
- Works closely with the IT department to build a roadmap for database systems architecture, coordinate the current and future direction of systems, and advise on how that direction will impact the enterprise.
- Partners with business and technology subject-matter experts to elicit and translate business requirements into technological solutions.
- Ability to think through multiple alternatives and select the best possible solutions for strategic and tactical business needs.
- Recognizes and identifies potential areas where existing data tables and architecture require change, or where new tables need to be developed, with awareness of interrelated systems and data.
- Performs database design reviews and recommends solutions considering integrity, performance, standards conformance, security, recovery, extensibility, and flexibility.
- Ensures that new database code meets standards for readability, reliability, and performance.
- Creates database objects and writes database procedures, functions, triggers, and utilities.
- Maintains database performance by calculating optimum values for database parameters, implementing new releases, completing maintenance requirements, and evaluating computer operating systems and hardware products.
- Monitors database performance and performs database tuning as well as capacity planning.
- Advises developers on the most efficient database designs, including tables, datatypes, stored procedures, and functions.
- Writes stored procedures and SQL code to implement pre-defined business rules and metrics, ensuring data quality in data warehouse processes (a short sketch follows this posting).
- Proficient in working with ETL toolsets.
- Maintains a working data dictionary for reference by technology and business users.
- Expertise in mapping data from source to target.
- Troubleshoots and corrects issues with existing BI solutions using database best practices.
- Researches leading-edge capabilities to determine the best solution fit for the overall environment.
- Works closely with cross-functional business teams to establish a governance framework, policies, and procedures for data management, and monitors compliance.
- Reviews regulatory, audit, and validation findings for risks impacting data governance.
- Facilitates the data risk assessment process and the development and implementation of remediation steps to mitigate identified risks.
- Develops and manages the process to resolve exceptions to established data governance policies and standards.
- Designs and builds data governance processes, standards, and enterprise data models.
- Develops a detailed knowledge of the underlying data and becomes the subject matter expert on content, current and potential future uses of data, and quality.
- Knowledge of system and software engineering architecture principles, concepts, and best practices.
- Experience coordinating technology designs with architectural requirements and constraints.
- Experience working with technologies such as the Microsoft BI stack, including SQL, SSIS, SSAS, SSRS, OLTP, Windows Server, Apache Tomcat, IIS, T-SQL, SFTP, Tableau, and Visual Basic.
- Strong written and oral communication skills and the ability to articulate and document data models as well as other related documentation; effectively communicates findings with both technical and non-technical team members.
- Proven ability to work effectively in a fast-paced environment as part of a high-performing team.
- Detail-oriented, with strong analytical, problem-solving, time management, and organizational skills.
- Able to multi-task and follow through on assignments to completion.
- Ability to prioritize work with tight schedules and target dates.
- Bachelor's degree in computer science, management information systems, or related fields, or equivalent.
Nice-to-have skills and experience: familiarity with TFS/Git; willingness to learn new systems; mortgage industry experience.
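A minimal sketch of deploying a T-SQL stored procedure that applies a pre-defined business rule, in the spirit of the posting above; it assumes pyodbc against SQL Server 2016+ (for CREATE OR ALTER). The server, schema, table names, and the "net revenue" rule are all hypothetical.

```python
# Deploy and run a stored procedure implementing a hypothetical business
# rule via pyodbc. Connection details and schema are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=dw.example.com;"
    "DATABASE=mart;Trusted_Connection=yes"  # hypothetical server/database
)

conn.execute("""
CREATE OR ALTER PROCEDURE dbo.usp_refresh_net_revenue AS
BEGIN
    SET NOCOUNT ON;
    TRUNCATE TABLE dbo.net_revenue_daily;
    INSERT INTO dbo.net_revenue_daily (sale_date, net_revenue)
    SELECT sale_date,
           SUM(gross_amount - discount_amount - refund_amount)
    FROM dbo.fact_sales
    WHERE gross_amount >= 0          -- business rule: ignore bad feeds
    GROUP BY sale_date;
END
""")
conn.commit()

# Nightly refresh call:
conn.execute("EXEC dbo.usp_refresh_net_revenue")
conn.commit()
```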
Computers/software | Pensacola, FL. Title: Database Engineer. Location: Pensacola, FL. Employment status: regular full-time.
Primary function: Solutions3 LLC, based out of Mahwah, NJ, is seeking a junior to mid-level database engineer to support a contract in Pensacola, FL. In this role the qualified candidate will provide technical expertise for database design, development, implementation, information storage and retrieval, and data flow and analysis. You will develop relational and/or object-oriented databases, database parser software, and database loading software. You will be responsible for developing a database structure that fits into the overall architecture of the system under development, and will have to make trades among data volumes, number of users, logical and physical distribution, response times, retention rules, security, and domain controls. You will work primarily at the front end of the lifecycle: requirements through system acceptance testing and initial operational capability (IOC). You will develop requirements from a project's inception to its conclusion for a particular business and information technology (IT) subject matter area (i.e., simple to complex systems). You will assist with recommendations for, and analysis and evaluation of, systems improvements, optimization, development, and/or maintenance efforts. You will translate a set of requirements and data into a usable document by creating or recreating ad hoc queries, scripts, and macros; updating existing queries and creating new ones to manipulate data into a master file; and building complex systems using queries, tables, open database connectivity, and database storage and retrieval, while using cloud methodologies.
Essential skills & responsibilities:
• Zero (0) to six (6) or more years of configuration management experience.
• Bachelor's degree in computer science, mathematics, statistics, or a related field; four (4) years of experience may be substituted for a degree.
• A master's degree in a related discipline may substitute for two (2) years of experience; four (4) years of experience (for a total of six (6) or more years) may be substituted for a degree.
• A PhD may substitute for four (4) years of experience; eight (8) years of experience (for a total of fourteen (14) or more years) may be substituted for a degree.
• DoDI 8570.1 compliance at IAT Level I certification.
• Apache Hadoop, PostgreSQL, MySQL, VMware, or Oracle DBMS knowledge, or SQL Server and its tools, including the facets of successfully administering a wide range of simple to highly complex environments.
• Experience with data and schema design and engineering.
• Demonstrated practical experience with data migration from legacy systems to central repositories.
• Industry-standard exchange schema implementation experience (e.g., CybOX or CAPEC).
• Able to evaluate and install new software releases, patches, and system upgrades.
• Knowledge and understanding of all aspects of database tuning: software configuration, memory usage, data access, data manipulation, SQL, and physical storage.
• Experience supporting a technology strategy roadmap.
• Experience with the development and execution of database security policies, procedures, and auditing; experience with database authentication methods, authorization methods, and data encryption techniques.
• Supervises development of databases, database parser software, and database loading software.
• Coordinates development of database structures that fit into the overall architecture of the system under development.
• Assesses requirement recommendations from a project's inception to its conclusion for a particular business and IT subject matter area (i.e., simple to complex systems).
• Leads development of databases, database parser software, database loading software, and database structures that fit into the overall architecture of the system under development.
• Develops requirement recommendations from a project's inception to its conclusion for a particular business and IT subject matter area (i.e., simple to complex systems).
• Possesses excellent oral and written communication skills.
• Must work well in a team environment as well as independently.
• Must exhibit good time management skills and independent decision-making capability, with a focus on customer service.
• Ability to work with the other technical members of the team to administer and support the overall database and applications environment.
Preferred but not required:
• The following certifications are desired:
Cloudera Certified Professional (CCP): Data Scientist; CCDH: Cloudera Certified Developer for Apache Hadoop; CCAH: Cloudera Certified Administrator for Apache Hadoop; CCSHB: Cloudera Certified Specialist in Apache HBase; and CSSLP: Certified Secure Software Lifecycle Professional.
• Understanding of certification and accreditation (NIST 800-53) processes as they apply to database technologies.
• Experience with MapReduce technologies.
• Experience with process development and deployment.
• Trained in Six Sigma methodology.
• ITIL knowledge and certification.
• Experience providing database engineering support to DHS, DoD, or intelligence customers.
• Data scientist skills and experience.
• Operating system and hardware platform knowledge preferred.
• Experience working with large unstructured data sets.
Clearance:
• U.S. citizenship.
• Active Top Secret/Sensitive Compartmented Information (TS/SCI) security clearance.
Solutions3 LLC is an equal opportunity employer and will not discriminate against any employee or applicant on the basis of age, color, disability, gender, national origin, race, religion, sexual orientation, veteran status, or any classification protected by federal, state, or local law.

Title: MarkLogic Developer (NoSQL). Location: Los Angeles, CA. Duration: 6-month contract. Interview mode: face to face.
Primary responsibilities:
- Develop and deploy XQuery modules and REST APIs to support complex searches and queries against the MarkLogic content repository (a small client-side sketch follows this posting).
- Develop and maintain content ingestion, validation, and transformation workflows that populate the department's MarkLogic XML repository.
- Develop and deploy data reports aggregating all available data sources through custom web applications.
- Research and develop POCs that demonstrate cutting-edge solutions, employing established data science and search methods such as machine learning, semantic enrichment, NLP, RDFS, semantic search, recommender algorithms, etc., and optimize these for integration with production systems.
- Contribute to the definition and implementation of the enterprise master data management architecture.
Required qualifications:
- Bachelor's degree in computer science, mathematics, engineering, or a similar discipline, or equivalent experience.
- 3+ years of MarkLogic 8/9 development and architecture experience.
- Able to architect and implement MarkLogic from scratch (data modeling as well).
- Able to code XQuery in MarkLogic.
- NoSQL experience (3+ years).
- provided by Dice: MarkLogic, NoSQL
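For context, a minimal sketch of calling the kind of REST search endpoint a MarkLogic developer would deploy. It assumes a MarkLogic REST API instance on its default port with digest authentication; the host, credentials, and query string are hypothetical, and a custom XQuery module would typically sit behind an endpoint like this.

```python
# Query a MarkLogic REST instance's search endpoint and print JSON results.
# Host, port, credentials, and the query are placeholders.
import requests
from requests.auth import HTTPDigestAuth

BASE = "http://marklogic.example.com:8000"   # hypothetical REST instance
AUTH = HTTPDigestAuth("rest-reader", "***")  # MarkLogic defaults to digest auth

resp = requests.get(
    f"{BASE}/v1/search",
    params={"q": "policy AND effective", "format": "json", "pageLength": 10},
    auth=AUTH,
    timeout=30,
)
resp.raise_for_status()

for result in resp.json().get("results", []):
    print(result["uri"], result.get("score"))
```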
Requisition number: 18-0137. Title: Big Data Architect. City: open.
Description: this role is with Forsythe, a Sirius company. As a nationally recognized IT solutions provider with an over 35-year history of success, Sirius is known for cultivating the best talent, providing a positive work environment, and offering a compensation and benefits package designed to help our employees thrive both personally and professionally. We deliver best-of-breed IT solutions from the world's top technology innovators, including Checkpoint, Cisco, Citrix, DellEMC, F5, FireEye, HDS, HP, IBM, NetApp, Nutanix, Palo Alto, VMware, and many more. If you want to work with the brightest minds in the business, contact us today.
Position summary: the big data architect is an accomplished leader in analytics, data engineering, architecture, governance, and framework design, and someone who guides teams to success. The big data architect approaches business problems with creative thinking and is focused on using the forefront of next-generation technology to address complex, value-added, real-world problems.
Primary function: support presales for big data opportunities; scope and execute big data projects for customers; support IP development for the practice.
Primary duties & responsibilities:
Delivery:
• Define client needs and oversee project milestones to ensure expectations, timelines, and budgets are met.
• Define data platform architecture and design; have hands-on capability to review code and make required changes.
• Responsible for the overall quality of project deliverables and the successful implementation of the defined solution for the customer.
• Identify and qualify follow-on opportunities and engage senior leadership.
• Establish procedures and recommend changes to policies that have a positive impact on the organization(s) and/or implementation team.
Client development:
• Occasionally interact with senior-level management at the client site or within the company, which involves negotiating or influencing others on matters of significance.
• Build long-term, superior client relationships, proactively manage client expectations, and ensure that change control is used when scope boundaries are exceeded.
Professional development:
• Maintain a strong network and promote the organization at various meetings, forums, panels, publications, and conferences; begin to establish thought leadership in the industry.
• Maintain technical certifications and attend training sessions to refine technical skills.
• Responsible for oversight and apprentice training of junior resources as assigned in the field.
• Security is every employee's responsibility; if you are aware of a security-related vulnerability or non-compliance with the information security policy or employee handbook, you must report it to the corporate security team, human resources, or a member of senior management.
• Participate at hire, and annually, in information security awareness training, as well as other required training identified by the human resources department. Other data privacy and data security related regulatory training may be required based on your role or assignment.
Requirements, basic qualifications:
• Bachelor's degree in computer science, information technology, engineering, or a related field.
• At least eight (8) years of information technology work experience, to include all of the following:
• Ability to deliver end-to-end advanced analytics strategy, ranging from architecting open-source-based big data analytic tools to building integrated solutions that meet emerging enterprise needs.
• Hands-on experience with various forms of data design, such as OLTP, OLAP, ODS, EDW (3rd normal form), BI data marts (star and snowflake schemas), and DSS, with knowledge of various BI tools and data management tools.
• At least six (6) years of experience in structured, semi-structured, and unstructured data, with progressive responsibilities in data warehousing and business intelligence, data architecting, and analytical modeling techniques.
• At least four (4) years of experience with big data technologies in the Hadoop ecosystem, such as Hive, HDFS, MapReduce, YARN, Kafka, Pig, Oozie, HBase, Sqoop, NiFi, and/or Ranger.
Other position requirements:
• Experienced in customization and optimization of any one of the big data distributions: MapR, Cloudera, or Hortonworks (HDP and HDF).
• Expertise in distributed, columnar, MPP, NoSQL, and document-storage DBs, in depth and at scale. Expertise in architecting, designing, and developing big data frameworks for real-time and batch analytics, and the ability to do capacity planning for large enterprises at petabyte scale.
• Proven coding experience in any one of the following programming languages: Scala, Python, or R, with advanced knowledge of Scala/Python frameworks.
• Proven knowledge of the Spark 2.2 framework.
• Demonstrated ability to configure, optimize, and fine-tune clusters for the various big data moving parts, such as Kafka, NiFi, Spark, and HDFS.
• Knowledge of big data multi-data-center distribution (active-active or active-passive) configuration and distcp.
• Interpersonal skills to work with data scientists, and the ability to convert legacy machine learning models to Spark ML batch jobs using PySpark (a short sketch follows this posting).
Preferred qualifications:
• Knowledge of HDFS, Cassandra, or MongoDB.
• Knowledge of any one of the following cloud big data offerings: Azure HDInsight or AWS EMR.
• Master's degree in computer science or data analytics.
• Integration of the big data ecosystem with advanced ML products such as Ayasdi or Linguamatics, or with data visualization products.
• Knowledge of integrating PySpark with native Anaconda libraries and deep learning libraries such as Keras or TensorFlow.
Essential functions: the position exists to provide technical consulting solutions to customers and as such requires the ability to travel to and from customer sites and interact with customers on an ongoing and regular basis. The above primary duties, responsibilities, and position requirements are not all-inclusive.
Sirius is an equal opportunity employer that values diversity. As a government contractor, Sirius takes affirmative action to employ and advance in employment qualified women, minorities, individuals with disabilities, and protected veterans; maintains a drug-free workplace; and participates in E-Verify. Individuals who receive job offers will be required to complete pre-employment screening that includes a background check verifying name, residences, education, work experience, and criminal convictions, consistent with the Fair Credit Reporting Act, and a drug test for controlled substances, consistent with the Drug-Free Workplace Act and the Americans with Disabilities Act. Sirius will not sponsor work eligibility for this position.
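A minimal sketch of the kind of Spark ML batch job a legacy single-node model might be ported into, as the requirement above describes. It assumes PySpark with the pipeline API; the paths, feature columns, and label are hypothetical.

```python
# Train and apply a logistic regression as a Spark ML batch job, the usual
# target shape when porting a legacy model to PySpark. Paths and column
# names are placeholders.
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.appName("legacy-model-port").getOrCreate()

train = spark.read.parquet("hdfs:///data/train/")    # hypothetical
score = spark.read.parquet("hdfs:///data/incoming/")

# Assemble raw columns into the single vector column Spark ML expects.
assembler = VectorAssembler(
    inputCols=["tenure_days", "txn_count", "avg_spend"],  # hypothetical features
    outputCol="features",
)
lr = LogisticRegression(featuresCol="features", labelCol="label")

model = Pipeline(stages=[assembler, lr]).fit(train)

# Batch-score and land the predictions for downstream consumers.
model.transform(score).select("customer_id", "prediction", "probability") \
     .write.mode("overwrite").parquet("hdfs:///data/scores/")
```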
Our client is made up of a richly diverse group of talented professionals. The company is a well-respected, Fortune 250, recognized brand and industry leader located in Nashville, TN. The company offers competitive compensation and benefits packages, flexible work schedules, excellent career advancement opportunities, and a collaborative, team-oriented work environment.
Responsibilities:
- Work with team leaders and business users/clients to provide analytics solutions.
- Recommend ongoing improvements to system design and workflow processes.
- Lead efforts for system implementations and enhancements.
- Create data monitoring models for products.
- Provide insight into changing database storage and utilization requirements and offer suggestions for solutions.
- Develop database design and architecture documentation.
- Maintain the integrity and security of the database environment.
Requirements:
- Bachelor's degree in computer science, engineering, or a related field.
- 5+ years of experience in data architecture.
- Background working with clients to analyze problems and deliver custom-built solutions.
Preferred technical experience: Tableau, Hadoop, PostgreSQL, Oracle.
About JDC Group: JDC Group is a technology recruiting and staffing company based in Atlanta, Georgia. Founded in 2005, we connect exceptional technology talent with successful companies all over North America to form productive teams. We were voted one of Atlanta's Best and Brightest Companies to Work For® and named one of Inc. 5000's fastest-growing private companies in America. To learn more about us, please visit www.jdc-group.com.

Title: Corporate Data Model. Location: Washington, DC. Duration: contract. Corporate data modeling experience is a must.
Job requirements:
- Corporate data model implementation.
- Verbal and written communication.
- Data modeling; metadata frameworks.
- Metadata tools such as Informatica Metadata Manager and Collibra.
- Set up and maintain the team SharePoint site.
Selection criteria: seeking a highly motivated individual who will bring the following qualities and capabilities to this position:
- Bachelor's degree in computer science, information science, management information science, statistics, or another data-intensive discipline.
- Knowledge of the corporate data model and participation in building a CDM.
- 1-2 years of experience in setting up SharePoint group sites.
- At least four (4) years of experience in data modeling and data analysis.
- At least two years of defining and building metadata.
- Hands-on experience working with metadata tools (preferably working experience in Collibra or similar tools) and Informatica Metadata Manager.
- Professional certification in one or more areas related to information architecture, information management, information governance, and business intelligence.
- Strong communication skills (written and oral).
- Experience facilitating issue-resolution sessions.
- Ability to describe data mapping and data migration techniques.
- Demonstrated hands-on experience developing entity-relationship models and data flow diagrams.
- Ability to export lists of data elements from Erwin models.
- Ability to extract lists of included data elements from Informatica PowerCenter ETL feeds and databases, leveraging PL/SQL (a small data-dictionary sketch follows this posting).
- Demonstrated hands-on experience analyzing large databases.
- Competent in office technology tools, with advanced Excel and Access skills.
- Data mining skills (including data auditing, aggregation, validation, and reconciliation) are a plus.
Competencies:
- Business judgment and analytical decision making: analyzes facts and data to support sound, logical decisions regarding own and others' work.
- Compliance with standards: monitors and maintains records on requests for information and assistance.
- Information seeking and bias for action: gathers and analyzes sufficient information to meet requirements from across IFC, innovates on solutions, and responds quickly and effectively to clients.
- Client orientation: takes personal responsibility and accountability for timely responses to client queries, requests, or needs, working to remove obstacles that may impede execution or overall success.
- Teamwork (collaboration) and inclusion: collaborates with other team members and contributes productively to the team's work and output, demonstrating respect for different points of view.
- Drive for results: takes personal ownership and accountability to meet deadlines and achieve agreed-upon results, and has the personal organization to do so.
- Knowledge, learning, and communication: actively seeks out and shares knowledge to complete assignments and disseminate learning and innovation to benefit the organization.
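A minimal sketch of extracting a list of data elements from an Oracle data dictionary, in the spirit of that PL/SQL requirement. It assumes the python-oracledb driver; the DSN, credentials, and schema name are hypothetical.

```python
# Pull a column-level inventory (a crude "list of data elements") from
# Oracle's data dictionary. Connection details and schema are placeholders.
import oracledb

conn = oracledb.connect(user="meta_reader", password="***",
                        dsn="dw.example.org/ORCLPDB1")  # hypothetical DSN

sql = """
SELECT table_name, column_name, data_type, nullable
FROM   all_tab_columns
WHERE  owner = :owner
ORDER  BY table_name, column_id
"""

with conn.cursor() as cur:
    cur.execute(sql, owner="EDW")  # hypothetical schema
    for table, column, dtype, nullable in cur:
        print(f"{table}.{column}: {dtype} (nullable={nullable})")
```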
Overview: we are seeking a data engineer with experience working with Amazon Web Services (AWS). In this role you will support analytics efforts across multiple clients, including but not limited to an automotive manufacturer and a grocery retailer.
Responsibilities:
- Lead the data management lifecycle, with an emphasis on the extract-transform-load (ETL) process, warehousing, cleansing, and quality assurance (QA).
- Automate data integration processes as required (API, FTP, S3, etc.); see the sketch after this posting.
- Scope, design, and deliver flexible and scalable data solutions for a variety of internal and client needs.
- Work closely with analytics, IT, and legal teams to design and deploy security controls based on required policies and standards (SOC 2).
- Develop technical requirements, standards, and documentation to support future workflows.
- Improve system performance through environment upgrades and improvements.
- Stay current with the latest strategies, tools, and best practices to improve the speed, efficiency, and stability of platforms.
Qualifications:
- Bachelor's degree in computer science, information systems, information technology, or another relevant course of study.
- 5+ years of experience with the AWS environment or another leading cloud provider.
- Experience architecting big data platforms: Elastic MapReduce (EMR), Spark, Hadoop, etc.
- Functional understanding of common scripting and query languages (Python, SQL, etc.)
- Strong understanding of industry best practices for developing and operating in the cloud.
- Ability to write clean and maintainable code.
- Strong documentation and diagramming skills.
- Must be organized, detail-oriented, flexible, and able to prioritize against short deadlines.
- Smart, collaborative, highly driven, and solutions-oriented; able to proactively address emerging problems or take advantage of new opportunities.
- Comfortable working independently, developing new processes, and making recommendations.
- AWS certifications are a plus (DevOps Engineer, Solutions Architect, and/or SysOps Administrator).
- Experience with AWS Lambda, Athena, Redshift, and/or Glue.
- Experience working with business intelligence tools (Tableau) and statistical analysis software (R, SAS, etc.)
- Ability to understand, interpret, and explain complex data solutions to internal and client leadership teams.
- Experience working with an advanced analytics or data science team.
- Previous marketing or business understanding.
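A minimal sketch of the S3 flavor of that integration automation, assuming boto3 with AWS credentials already configured; the bucket names and prefixes are hypothetical.

```python
# Move newly arrived vendor files from a drop bucket into a curated prefix,
# a common S3-based integration step. Buckets and prefixes are placeholders.
import boto3

s3 = boto3.client("s3")

DROP_BUCKET, DROP_PREFIX = "client-dropbox", "incoming/"
LAKE_BUCKET, LAKE_PREFIX = "analytics-lake", "raw/vendor_x/"

paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=DROP_BUCKET, Prefix=DROP_PREFIX):
    for obj in page.get("Contents", []):
        key = obj["Key"]
        if not key.endswith(".csv"):
            continue  # skip manifests and stray files
        # Server-side copy into the lake, then remove the original.
        s3.copy_object(
            Bucket=LAKE_BUCKET,
            Key=LAKE_PREFIX + key.split("/")[-1],
            CopySource={"Bucket": DROP_BUCKET, "Key": key},
        )
        s3.delete_object(Bucket=DROP_BUCKET, Key=key)
```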
Principal Software Engineer - Oracle Data Cloud
Preferred qualifications: Senior Java Engineer, Data Cloud.
"The AddThis ODC engineering team is solving data challenges at huge scale (data sets in the billions!!) that is very unique in the industry, and certainly in the DC area. It's a fast-paced start-up culture with the stability and backing of a top software company." - Yuesong Wang, Director of Development, Oracle Data Cloud (ODC). https://github.com/addthis/hydra
ODC engineers are responsible for building robust distributed systems and infrastructure that process massive amounts of data, execute machine learning algorithms at scale, and make the insights derived from these processes available to millions of website publishers and billions of users in near real time. As the senior Java engineer on the platform team, your job will be to build and maintain the systems and algorithms that make this possible. Our software stack includes many best-of-breed open source technologies, such as Cassandra and Kafka, as well as home-grown tools, some of which have been open sourced, such as Hydra (check out our GitHub page). In addition to building the distributed systems that enable data processing at scale, you will work with our data scientists to implement machine learning algorithms that are able to operate efficiently in a distributed environment. Data is at the heart of what we do at AddThis, and your work will be a critical factor in our success.
Responsibilities:
- Build and maintain high-performance distributed systems.
- Design and implement highly scalable APIs and services that receive billions of requests per day.
- Be fanatical about performance and performance monitoring.
- Use efficient data structures and algorithms to enable data processing at scale (see the sketch after this posting).
- Code primarily in Java, but also be able to use the right language for the right job.
Requirements:
- Extensive experience with distributed computing, performance analysis, network protocols, data storage subsystems, and Linux.
- Strong computer science fundamentals, including a deep understanding of data structures and distributed algorithms.
- Able to create elegant, efficient, and testable code.
- Expert Java programmer with a deep understanding of the JVM, the Java memory model, and asynchronous I/O.
- BS, MS, or PhD in computer science or a related field.
Desired:
- Experience processing very large data sets.
- Experience working with advertising systems.
- Ability to understand and implement machine learning algorithms.
- Experience with Cassandra, Kafka, Hadoop, Spark, Riak, or similar technologies.
- Experience working with Docker, Mesos, SDC, or similar technologies.
- Familiarity with common POSIX and Linux-specific system calls.
- Understanding of Linux kernel development.
- Experience developing high-performance software that operates on SSDs.
Detailed description and job requirements: design, develop, troubleshoot, and debug software programs for databases, applications, tools, networks, etc. As a member of the software engineering division, you will take an active role in the definition and evolution of standard practices and procedures. You will be responsible for defining and developing software for tasks associated with the development, design, and debugging of software applications or operating systems. Work is non-routine and very complex, involving the application of advanced technical and business skills in the area of specialization. Leading contributor, individually and as a team member, providing direction and mentoring to others. BS or MS degree or equivalent experience relevant to the functional area; 7 years of software engineering or related experience. Oracle is an equal employment opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability, and protected veteran status, or any other characteristic protected by law.
Job: product development. Location: US-VA, Virginia-Vienna. Other locations: US-VA, Virginia-Reston. Job type: regular employee hire. Organization: Oracle.
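As an illustration of the "efficient data structures and algorithms at scale" theme in the posting above (not AddThis's actual stack), here is a small sketch of the Space-Saving heavy-hitters algorithm, a standard fixed-memory trick for finding frequent keys in streams too large to count exactly.

```python
# Space-Saving algorithm: track approximate top-k keys in a fixed-size map.
# Illustrative only; not tied to any particular employer's systems.
def space_saving(stream, capacity=3):
    counts = {}
    for key in stream:
        if key in counts:
            counts[key] += 1
        elif len(counts) < capacity:
            counts[key] = 1
        else:
            # Evict the current minimum and inherit its count plus one,
            # which bounds the overestimation error per key.
            victim = min(counts, key=counts.get)
            counts[key] = counts.pop(victim) + 1
    return counts

events = ["ad1", "ad2", "ad1", "ad3", "ad1", "ad4", "ad2", "ad1"]
print(space_saving(events))  # approximate heavy hitters with counts
```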
Title: Director, Data Governance Lead. Location: United States-Connecticut-Hartford. Job number: 1800294.
Director, Data Governance Lead: Enterprise Data Trust team, Enterprise Data Office, The Hartford, Hartford, CT.
The candidate will lead The Hartford's enterprise data trust team within our data governance organization. The organization of approximately 15 teammates is empowered to drive tangible business data value through the delivery and adoption of enterprise governance data policies, standards, services, and supporting technologies. The role will work very closely with line-of-business teams to motivate business data ownership and improve the quality of critical data across the enterprise.
Responsibilities:
+ Lead the enterprise data trust organization, which includes accountability for: data governance and principles; data quality control and certification; metadata management; data knowledge management; enterprise data stewardship strategy; and enterprise master data management strategy.
+ Define and execute the enterprise data trust strategy (18-month plan), with recurring milestones and appropriate expectation management with executive sponsors.
+ Manage and influence enterprise data quality through the recurring certification of enterprise data assets and the reporting of data asset health in the enterprise data scoring process.
+ Facilitate the enterprise data oversight committee with senior executive leaders to: maintain the data governance strategy; define data quality priorities; manage data certification results and the enterprise scorecard; promote business data ownership; manage data policy compliance; and resolve conflicts on enterprise data issues across lines of business.
+ Serve as enterprise data trust advocate and change champion.
+ Provide enterprise championship and escalation support for line-of-business data stewardship teams.
+ Partner with and support individual line-of-business data governance councils, including commercial lines, personal lines, claims, group benefits, and others.
+ Define and promote data quality standards, controls, and measures.
+ Act as enterprise champion to develop and grow enterprise data knowledge, both employees' proprietary knowledge and written documentation.
+ Identify leading capabilities within lines of business to be shared and exploited across the enterprise.
Qualifications:
+ Bachelor's or master's degree in a relevant field.
+ 5 years of experience leading enterprise data governance and quality initiatives across large, complex organizations.
+ 10 years of data management experience, such as BI reporting, data stewardship, data quality management, metadata management, data governance, data process ownership, and/or data manufacturing.
+ Working understanding of industry best practices and technologies that effectively govern and manage data from various perspectives (data governance, curation, preparation, stewardship, analysis, and/or reporting).
+ Relationship management, execution, and delivery.
+ Proven experience partnering with business intelligence, analytical, actuarial, and/or data science communities.
+ Proven ability to simplify technology and data concepts for business stakeholders to help drive adoption.
+ Excellent written and verbal communication and presentation skills, with the ability to effectively communicate with senior management: the big picture, plus an actionable plan to execute.
+ Results-oriented, with the demonstrated ability to apply strategic and decisive problem-solving skills in ambiguous situations.
+ Strong analytical, critical-thinking, and problem-solving skills.
Leadership:
+ Strong leadership and influencing skills at the senior management level.
+ Build talent within the team through coaching opportunities, and team growth by fostering an environment that is a destination for talent.
+ Proven ability to create a high-performing team that has a culture of continuous learning and collaboration and is focused on business value and outcomes.
+ Build commitment and empower others; communicate with clarity, courage, and timeliness.
+ P&C insurance experience preferred.
Behaviors at The Hartford:
+ Deliver outcomes: demonstrate a bias for speed and execution that serves our shareholders and customers.
+ Operate as a team player: work together to drive solutions for the good of The Hartford.
+ Build strong partnerships: demonstrate integrity and build trust with others.
+ Strive for excellence: motivate yourself and others to achieve high standards and continuously improve.
The role will be filled at the director or AVP level, depending upon candidate experience. Equal opportunity employer: females, minorities, veterans, disability, sexual orientation, gender identity or expression, religion, age. No agencies, please.
Job: business data analysis. Primary location: United States-Connecticut-Hartford. Schedule: full-time. Job level: director. Education level: bachelor's degree (±16 years). Job type: standard. Shift: day job. Employee status: regular. Overtime status: exempt. Travel: yes, 10% of the time. Posting: Jan 26, 2018, 1:56:46 PM. Remote worker option: no.
The Hartford is an equal employment and affirmative action employer. All qualified applicants will receive consideration without regard to race, color, sex, religion, age, national origin, disability, veteran status, sexual orientation, gender identity or expression, marital status, ancestry or citizenship status, genetic information, pregnancy status, or any other characteristic protected by law. The Hartford maintains a drug-free workplace and is committed to building inclusion and leveraging diversity.
Tiffany & Co. is seeking an IT Director - Data Engineering in Parsippany, NJ, as part of our enterprise information management team. This position will lead the design and development of data and analytics solutions on an AWS-hosted platform; build and motivate a high-performing team of data engineers; advise and inform the executive team regarding technology solutions that can maximize business value; and provide the necessary leadership to establish the strategic framework for success. This leader will need strong collaboration skills to promote cross-functional partnerships with architecture, infrastructure, testing, and other shared-services teams, and to champion change while delivering measurable results. The IT Director - Data Engineering will be the team leader and technical subject matter expert, including mentorship of the engineering team via architecture and design reviews, code reviews, etc. S/he will have overall responsibility for technical development and support related to Tiffany's enterprise data hub (EDH, an AWS-based data management platform) and enterprise data warehouse (on-prem EDW). The position will develop and implement a strategy for migrating key data and functionality from our legacy enterprise data warehouse solution to our enterprise data hub so that the EDW can be retired. While developing strategic technology plans linked to key business objectives, the IT Director - Data Engineering will establish the technical direction for the team, working closely with implementation and support partners, driving the necessary changes, and making appropriate technology choices while working collaboratively with the architecture team. S/he will collaborate with business and technical stakeholders to define technical requirements and design based on the business requirements. S/he will champion a shift toward a DevOps mentality, emphasizing continuous integration, release management, and automated testing to maximize development agility and improve time to market (a small testing sketch follows). This role will communicate appropriately and manage relationships with external vendors to determine technical competence and identify integration opportunities.
Requirements:
- Bachelor's degree and/or equivalent experience in information technology, computer science, or a related area.
- A minimum of 5 years of experience in an advanced analytics engineering role.
- A minimum of 8 to 10 years of experience managing complex IT projects and overseeing internal and external resources.
- A minimum of 5 years of experience managing software development teams and delivering solutions on a cloud platform, preferably on AWS with Hadoop, Hive, Redshift, etc.
- An understanding of large-scale computing solutions, including software design and development, database architectures, IP networking, security, and performance tuning.
- Knowledge of cloud security, orchestration, and management.
- Strong communication skills, including communicating effectively across internal and external organizations.
- Experience working in a global retail environment preferred.
- Experience with Apache Spark and/or PySpark, and real-time streaming platforms such as AWS Kinesis or Apache Kafka/Storm, is ideal.
- Experience with CDC technologies such as IBM CDC, Informatica Power Exchange, etc. is preferred.
- Experience with big data ETL technologies such as Informatica BDM, Talend, SnapLogic, Matillion, etc. is ideal.
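Finally, a minimal sketch of the automated-testing habit that posting calls for: a pytest-style unit test around a hypothetical, pure transformation function so it can run in CI on every commit.

```python
# Unit-test a small, pure transformation. The transform and its rules are
# hypothetical stand-ins for real ETL logic; run with pytest.
def normalize_sku(raw: str) -> str:
    """Uppercase, strip whitespace, and zero-pad numeric SKUs to 8 digits."""
    sku = raw.strip().upper()
    return sku.zfill(8) if sku.isdigit() else sku

def test_normalize_sku_pads_numeric():
    assert normalize_sku(" 1234 ") == "00001234"

def test_normalize_sku_preserves_alphanumeric():
    assert normalize_sku("ab-99") == "AB-99"
```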