SME - Global Payroll Solutions
CrowdStrike
As a global leader in cybersecurity, CrowdStrike protects the people, processes and technologies that drive modern organizations. Since 2011, our mission hasn’t changed — we’re here to stop breaches, and we’ve redefined modern security with the world’s most advanced AI-native platform. Our customers span all industries, and they count on CrowdStrike to keep their businesses running, their communities safe and their lives moving forward. We’re also a mission-driven company. We cultivate a culture that gives every CrowdStriker both the flexibility and autonomy to own their careers. We’re always looking to add talented CrowdStrikers to the team who have limitless passion, a relentless focus on innovation and a fanatical commitment to our customers, our community and each other. Ready to join a mission that matters? The future of cybersecurity starts with you.
About the Role:
CrowdStrike is seeking a Global Payroll Solutions SME with expert-level Alteryx and Python development skills to transform our payroll data operations through comprehensive analytics automation. This role will leverage both the Alteryx platform and Python programming to design scalable, efficient solutions that convert complex Payroll/Finance data into actionable business intelligence. The ideal candidate will create end-to-end automation solutions, from data extraction and transformation to advanced analytics and user-friendly applications for non-technical stakeholders, with strong expertise in Snowflake data lake/warehouse architecture and management.
What You’ll Do:
Architect end-to-end data workflows using Alteryx Designer, Server, and Python
Implement automated Payroll/Finance reporting solutions with scheduled execution
Design and optimize data preparation processes for accuracy and efficiency
Develop Python scripts for complex data processing, API integrations, and custom automation
Create self-service analytics applications for Payroll/Finance team consumption
Establish data governance standards within automated workflows
Build robust error handling, logging, and monitoring systems
Provide technical leadership on Alteryx and Python best practices
Design, implement, and maintain Snowflake data lake/warehouse infrastructure for payroll data storage and retrieval
Establish data governance, security, and access control policies within Snowflake
Optimize Snowflake performance and cost efficiency
What You’ll Need:
Alteryx Designer Expertise
4+ years' experience with complex workflow development
Advanced proficiency with:
Data preparation tools (Join, Union, Filter, data parsing, and Formula)
Spatial analytics capabilities
Predictive tools and statistical analysis
Regular expressions and string manipulation
Iterative macros and batch processing
In-database processing techniques
API integration tools
Python tool integration within Alteryx workflows
Snowflake connectors and integration tools
Data lake connectors for AWS/Azure (optional)
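To give a flavor of the Python tool integration called out above, here is a minimal sketch of the kind of snippet that runs inside the Alteryx Python tool; it assumes the ayx package that the tool provides, and the payroll column names are hypothetical.

# Runs inside the Alteryx Python tool (Jupyter-based); the ayx package is supplied by the tool.
from ayx import Alteryx
import pandas as pd

# Read the data stream connected to the tool's first input anchor.
df = Alteryx.read("#1")

# Illustrative transformation: normalize a hypothetical pay-period column and flag missing cost centers.
df["pay_period_end"] = pd.to_datetime(df["pay_period_end"], errors="coerce")
df["missing_cost_center"] = df["cost_center"].isna()

# Write the result to the tool's first output anchor for downstream Alteryx tools.
Alteryx.write(df, 1)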
Python Development Skills (Required)
3+ years' experience with Python for data automation and analytics
Core Python Libraries:
Pandas - Advanced data manipulation, transformation, and analysis
NumPy - Numerical computing and array operations
Openpyxl/XlsxWriter - Excel file generation and manipulation
SQLAlchemy - Database connectivity and ORM
Requests - API integration and web services
Schedule/APScheduler - Task scheduling and automation
Logging - Comprehensive error tracking and monitoring
Snowflake Connector for Python - Direct Snowflake integration and data operations
Snowflake SQLAlchemy - ORM support for Snowflake
Boto3 - AWS SDK for Python (optional - S3, Glue integration with Snowflake)
Azure SDK - Azure integration (optional)
PyArrow/Parquet - Efficient columnar data format handling
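As a rough illustration of how several of the libraries above fit together, the sketch below loads a payroll extract with pandas, bulk-writes it to Snowflake with the Snowflake Connector for Python, and logs the outcome; the connection parameters, source file, and table name are placeholders.

import logging

import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("payroll_load")

# Hypothetical source extract; in practice this might come from openpyxl, an API, or Alteryx.
df = pd.read_csv("payroll_extract.csv", parse_dates=["pay_period_end"])

# Placeholder credentials; store real secrets in a secrets manager, never in code.
conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<user>",
    password="<password>",
    warehouse="PAYROLL_WH",
    database="PAYROLL",
    schema="RAW",
)

try:
    # write_pandas bulk-loads the DataFrame into an existing Snowflake table.
    success, _, nrows, _ = write_pandas(conn, df, "PAYRUN_RAW")
    log.info("Loaded %s rows into PAYRUN_RAW (success=%s)", nrows, success)
except Exception:
    log.exception("Payroll load failed")
    raise
finally:
    conn.close()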
Automation & Integration:
Building automated data pipelines and ETL processes
REST API development and consumption
Email automation (SMTP, email notifications)
File system operations and automated file handling
Integration with cloud storage (AWS S3, Azure Blob Storage)
Web scraping when necessary (BeautifulSoup, Selenium)
Snowflake data ingestion pipelines and orchestration
Snowpipe for continuous data loading
Streaming data integration into Snowflake (optional)
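A compressed sketch of this kind of automation: a scheduled job that pulls records from a hypothetical payroll REST API and sends an email notification if the run fails. The endpoint, SMTP host, and addresses are placeholders, and a production deployment would typically swap the loop for an orchestrator.

import smtplib
import time
from email.message import EmailMessage

import requests
import schedule  # pip install schedule

API_URL = "https://payroll.example.com/api/v1/payruns"  # placeholder endpoint


def notify_failure(error: str) -> None:
    # Simple SMTP notification; host and addresses are placeholders,
    # and authentication (smtp.login) is omitted for brevity.
    msg = EmailMessage()
    msg["Subject"] = "Payroll pipeline failure"
    msg["From"] = "noreply@example.com"
    msg["To"] = "payroll-ops@example.com"
    msg.set_content(error)
    with smtplib.SMTP("smtp.example.com", 587) as smtp:
        smtp.starttls()
        smtp.send_message(msg)


def run_pipeline() -> None:
    try:
        resp = requests.get(API_URL, timeout=60)
        resp.raise_for_status()
        records = resp.json()
        # Hand the records off to the load step (e.g. write_pandas or a Snowpipe stage).
        print(f"Fetched {len(records)} payrun records")
    except Exception as exc:
        notify_failure(str(exc))
        raise


# Run every day at 02:00 server time.
schedule.every().day.at("02:00").do(run_pipeline)
while True:
    schedule.run_pending()
    time.sleep(60)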
Data Processing:
Complex data transformations and business logic implementation
Data validation and quality assurance frameworks
Handling large datasets efficiently
Multi-threaded/asynchronous processing
Custom calculation engines for payroll metrics
Snowflake-optimized query patterns and best practices
Data clustering and partitioning strategies in Snowflake
Data compression and format optimization (Parquet, ORC, Avro)
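For the large-dataset and validation points above, one common pattern is chunked processing with lightweight quality checks before load; the file name and column names below are illustrative.

import pandas as pd

REQUIRED_COLUMNS = {"employee_id", "pay_period_end", "gross_pay", "currency"}


def validate_chunk(chunk: pd.DataFrame) -> pd.DataFrame:
    """Apply simple data-quality rules to one chunk of a payroll extract."""
    missing = REQUIRED_COLUMNS - set(chunk.columns)
    if missing:
        raise ValueError(f"Missing required columns: {missing}")
    # Flag rows that would break downstream payroll metrics.
    bad = chunk["gross_pay"].isna() | (chunk["gross_pay"] < 0)
    if bad.any():
        # In a real pipeline these rows would be routed to an exceptions table.
        print(f"{bad.sum()} rows failed gross_pay validation")
    return chunk.loc[~bad]


# Stream a large extract in 100k-row chunks instead of loading it all into memory.
cleaned = []
for chunk in pd.read_csv("payroll_extract.csv", chunksize=100_000):
    cleaned.append(validate_chunk(chunk))

result = pd.concat(cleaned, ignore_index=True)
print(f"{len(result)} validated rows ready for load")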
Development Best Practices:
Object-oriented programming principles
Writing clean, maintainable, documented code
Unit testing (pytest, unittest)
Version control with Git
Virtual environment management
CI/CD pipeline experience (preferred)
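A minimal pytest example in the spirit of the testing point above, against a hypothetical proration helper; the function under test and its tests are combined in one file for brevity.

# test_payroll_calc.py -- run with `pytest`
import pytest


def prorate_salary(annual_salary: float, days_worked: int, days_in_period: int) -> float:
    """Prorate an annual salary for a partial pay period (illustrative logic only)."""
    if days_in_period <= 0:
        raise ValueError("days_in_period must be positive")
    return round(annual_salary / 12 * days_worked / days_in_period, 2)


def test_full_period_equals_monthly_salary():
    assert prorate_salary(120_000, 30, 30) == 10_000.00


def test_half_period_is_half_monthly_salary():
    assert prorate_salary(120_000, 15, 30) == pytest.approx(5_000.00)


def test_invalid_period_raises():
    with pytest.raises(ValueError):
        prorate_salary(120_000, 10, 0)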
Snowflake Data Platform Expertise (Required)
3+ years' hands-on experience with the Snowflake data platform
Snowflake Architecture & Design:
Designing multi-layer data architectures in Snowflake (raw, curated, consumption layers)
Implementing medallion architecture (bronze, silver, gold layers)
Database, schema, and table design best practices
Understanding of Snowflake's unique architecture (storage, compute, cloud services)
Virtual warehouse sizing and configuration
Multi-cluster warehouse management
Data sharing capabilities (internal and external)
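A sketch of what this layered setup can look like as DDL issued from Python via the Snowflake Connector; the database, schema, and warehouse names are illustrative.

import snowflake.connector

# Placeholder credentials; in practice use key-pair auth or a secrets manager.
conn = snowflake.connector.connect(
    account="<account_identifier>", user="<user>", password="<password>", role="SYSADMIN"
)

DDL = [
    # Raw / curated / consumption layers (a medallion-style bronze/silver/gold split).
    "CREATE DATABASE IF NOT EXISTS PAYROLL",
    "CREATE SCHEMA IF NOT EXISTS PAYROLL.RAW",
    "CREATE SCHEMA IF NOT EXISTS PAYROLL.CURATED",
    "CREATE SCHEMA IF NOT EXISTS PAYROLL.CONSUMPTION",
    # Right-sized virtual warehouse that suspends itself when idle.
    """CREATE WAREHOUSE IF NOT EXISTS PAYROLL_WH
         WAREHOUSE_SIZE = 'XSMALL'
         AUTO_SUSPEND = 60
         AUTO_RESUME = TRUE
         INITIALLY_SUSPENDED = TRUE""",
]

try:
    cur = conn.cursor()
    for stmt in DDL:
        cur.execute(stmt)
finally:
    conn.close()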
Snowflake Development:
Advanced SQL development in Snowflake
Stored procedures and user-defined functions (UDFs)
Snowflake scripting and task automation
Streams and tasks for CDC (Change Data Capture)
Dynamic tables and materialized views
Time travel and data cloning features
Zero-copy cloning for development/testing environments
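For the streams-and-tasks CDC point, the statements below (issued the same way through the Python connector) create a stream on a hypothetical raw payrun table and a task that merges changes into the curated layer; all object and column names are illustrative.

import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>", user="<user>", password="<password>", role="SYSADMIN"
)

CDC_SQL = [
    # Capture inserts/updates/deletes on the raw table.
    "CREATE OR REPLACE STREAM PAYROLL.RAW.PAYRUN_STREAM ON TABLE PAYROLL.RAW.PAYRUN",
    # The task runs only when the stream has data, then merges changes into the curated table.
    """CREATE OR REPLACE TASK PAYROLL.CURATED.MERGE_PAYRUN
         WAREHOUSE = PAYROLL_WH
         SCHEDULE = '15 MINUTE'
       WHEN SYSTEM$STREAM_HAS_DATA('PAYROLL.RAW.PAYRUN_STREAM')
       AS
         MERGE INTO PAYROLL.CURATED.PAYRUN t
         USING PAYROLL.RAW.PAYRUN_STREAM s
           ON t.payrun_id = s.payrun_id
         WHEN MATCHED THEN UPDATE SET t.gross_pay = s.gross_pay
         WHEN NOT MATCHED THEN INSERT (payrun_id, gross_pay) VALUES (s.payrun_id, s.gross_pay)""",
    # Tasks are created suspended; resume to start the schedule.
    "ALTER TASK PAYROLL.CURATED.MERGE_PAYRUN RESUME",
]

try:
    cur = conn.cursor()
    for stmt in CDC_SQL:
        cur.execute(stmt)
finally:
    conn.close()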
Data Integration & Loading:
Connecting Alteryx and Python workflows to Snowflake
Snowpipe for automated data ingestion
COPY INTO commands and bulk loading strategies
External tables and stages (AWS S3, Azure Blob)
File format definitions and optimization
Incremental data loading patterns
Error handling and data validation during loads
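A sketch of a bulk load from an external stage using COPY INTO; the bucket, storage integration, file format, and target table are placeholders, and a Snowpipe for continuous loading would reuse the same COPY definition.

import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>", user="<user>", password="<password>", role="SYSADMIN"
)

LOAD_SQL = [
    # File format for the source CSV extracts.
    """CREATE FILE FORMAT IF NOT EXISTS PAYROLL.RAW.CSV_FMT
         TYPE = 'CSV' SKIP_HEADER = 1 FIELD_OPTIONALLY_ENCLOSED_BY = '"'""",
    # External stage over an S3 prefix; PAYROLL_S3_INT is a hypothetical, pre-created storage integration.
    """CREATE STAGE IF NOT EXISTS PAYROLL.RAW.PAYROLL_STAGE
         URL = 's3://<bucket>/payroll/'
         STORAGE_INTEGRATION = PAYROLL_S3_INT
         FILE_FORMAT = PAYROLL.RAW.CSV_FMT""",
    # Bulk load; COPY INTO skips files it has already loaded, giving a simple incremental pattern.
    """COPY INTO PAYROLL.RAW.PAYRUN
         FROM @PAYROLL.RAW.PAYROLL_STAGE
         PATTERN = '.*payrun_.*[.]csv'
         ON_ERROR = 'ABORT_STATEMENT'""",
]

try:
    cur = conn.cursor()
    for stmt in LOAD_SQL:
        cur.execute(stmt)
finally:
    conn.close()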
Performance Optimization:
Query optimization and performance tuning
Clustering key design and maintenance
Search optimization service configuration
Result caching strategies
Warehouse auto-suspend and auto-resume configuration
Query profiling and execution plan analysis
Cost optimization through warehouse management
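Illustrative tuning statements for the points above: clustering a large fact table, configuring auto-suspend, and pulling the most expensive recent queries for profiling. Object names are hypothetical.

import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>", user="<user>", password="<password>", role="SYSADMIN"
)

try:
    cur = conn.cursor()
    # Cluster the large fact table on the columns most queries filter by.
    cur.execute(
        "ALTER TABLE PAYROLL.CURATED.PAYRUN CLUSTER BY (pay_period_end, country_code)"
    )
    # Keep compute costs down by suspending the warehouse after 60 seconds of inactivity.
    cur.execute("ALTER WAREHOUSE PAYROLL_WH SET AUTO_SUSPEND = 60 AUTO_RESUME = TRUE")
    # Surface the slowest queries of the past week for deeper analysis in the query profile.
    cur.execute(
        """SELECT query_id, warehouse_name, total_elapsed_time / 1000 AS seconds
             FROM SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY
            WHERE start_time >= DATEADD('day', -7, CURRENT_TIMESTAMP())
            ORDER BY total_elapsed_time DESC
            LIMIT 20"""
    )
    for query_id, warehouse, seconds in cur.fetchall():
        print(query_id, warehouse, seconds)
finally:
    conn.close()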
Security & Governance:
Role-based access control (RBAC) implementation
Row-level and column-level security
Data masking and tokenization policies
Network policies and IP whitelisting
Snowflake object tagging for governance
Audit logging and compliance monitoring
PII/sensitive data handling and protection
Data retention and time travel policies
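A sketch of RBAC plus column-level masking for sensitive pay data along the lines above; the roles, tables, and columns are illustrative, and the statements should be run with a role holding the relevant privileges.

import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>", user="<user>", password="<password>", role="<admin_role>"
)

SECURITY_SQL = [
    # Read-only role for analysts, limited to the consumption layer.
    "CREATE ROLE IF NOT EXISTS PAYROLL_ANALYST",
    "GRANT USAGE ON DATABASE PAYROLL TO ROLE PAYROLL_ANALYST",
    "GRANT USAGE ON SCHEMA PAYROLL.CONSUMPTION TO ROLE PAYROLL_ANALYST",
    "GRANT SELECT ON ALL TABLES IN SCHEMA PAYROLL.CONSUMPTION TO ROLE PAYROLL_ANALYST",
    # Column-level masking: only payroll admins see raw salary values.
    """CREATE MASKING POLICY IF NOT EXISTS PAYROLL.CONSUMPTION.MASK_SALARY
         AS (val NUMBER) RETURNS NUMBER ->
         CASE WHEN CURRENT_ROLE() IN ('PAYROLL_ADMIN') THEN val ELSE NULL END""",
    """ALTER TABLE PAYROLL.CONSUMPTION.EMPLOYEE_PAY
         MODIFY COLUMN base_salary SET MASKING POLICY PAYROLL.CONSUMPTION.MASK_SALARY""",
]

try:
    cur = conn.cursor()
    for stmt in SECURITY_SQL:
        cur.execute(stmt)
finally:
    conn.close()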
Monitoring & Administration:
Snowflake resource monitors and alerts
Query history and performance monitoring
Credit usage tracking and optimization
Account usage views and reporting
Troubleshooting connectivity and performance issues
Backup and disaster recovery strategies
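A sketch of a resource monitor plus a month-to-date credit usage check; the quota, warehouse, and monitor names are illustrative, and resource monitors typically require the ACCOUNTADMIN role.

import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>", user="<user>", password="<password>", role="ACCOUNTADMIN"
)

try:
    cur = conn.cursor()
    # Cap monthly spend: notify at 80% of the quota, suspend the warehouse at 100%.
    cur.execute(
        """CREATE OR REPLACE RESOURCE MONITOR PAYROLL_MONTHLY_MONITOR
             WITH CREDIT_QUOTA = 100
                  FREQUENCY = MONTHLY
                  START_TIMESTAMP = IMMEDIATELY
             TRIGGERS ON 80 PERCENT DO NOTIFY
                      ON 100 PERCENT DO SUSPEND"""
    )
    cur.execute("ALTER WAREHOUSE PAYROLL_WH SET RESOURCE_MONITOR = PAYROLL_MONTHLY_MONITOR")
    # Month-to-date credit consumption per warehouse.
    cur.execute(
        """SELECT warehouse_name, SUM(credits_used) AS credits
             FROM SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY
            WHERE start_time >= DATE_TRUNC('month', CURRENT_DATE())
            GROUP BY warehouse_name
            ORDER BY credits DESC"""
    )
    for warehouse, credits in cur.fetchall():
        print(warehouse, credits)
finally:
    conn.close()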
Work Location: Kharadi, Pune
Shift Timing: 2:00 PM to 11:00 PM IST
#LI-SA2
Benefits of Working at CrowdStrike:
Market leader in compensation and equity awards
Comprehensive physical and mental wellness programs
Competitive vacation and holidays for recharge
Paid parental and adoption leaves
Professional development opportunities for all employees regardless of level or role
Employee Networks, geographic neighborhood groups, and volunteer opportunities to build connections
Vibrant office culture with world class amenities
Great Place to Work Certified™ across the globe
CrowdStrike is proud to be an equal opportunity employer. We are committed to fostering a culture of belonging where everyone is valued for who they are and empowered to succeed. We support veterans and individuals with disabilities through our affirmative action program.
CrowdStrike is committed to providing equal employment opportunity for all employees and applicants for employment. The Company does not discriminate in employment opportunities or practices on the basis of race, color, creed, ethnicity, religion, sex (including pregnancy or pregnancy-related medical conditions), sexual orientation, gender identity, marital or family status, veteran status, age, national origin, ancestry, physical disability (including HIV and AIDS), mental disability, medical condition, genetic information, membership or activity in a local human rights commission, status with regard to public assistance, or any other characteristic protected by law. We base all employment decisions--including recruitment, selection, training, compensation, benefits, discipline, promotions, transfers, lay-offs, return from lay-off, terminations and social/recreational programs--on valid job requirements.
If you need assistance accessing or reviewing the information on this website or need help submitting an application for employment or requesting an accommodation, please contact us at [email protected] for further assistance.
