Customize Cookies
Collecting users' personal information or other data provided by this service without the consent of each data subject is prohibited and refused. Please note that even for publicly available data, unauthorized collection using crawlers or other technical means may be subject to criminal penalties under the Personal Information Protection Act.
© 2025 Rocketpunch, 주식회사 더블에이스, 김인기, 12 Seongsuil-ro 10-gil, Seongdong-gu, Seoul, Republic of Korea, 12F Room 1, 04793, support@rocketpunch.com, +82 10-2710-7121
Business registration number 206-87-09615


이재천
Hello. I am a developer with 12 years of experience in front-end, back-end, DevOps, and DataOps. I can work in JavaScript, Python, Java, and PHP. (Residing in Japan)
Career
Posts
AI Career Summary
이재천 is a 12-year developer with broad experience across front-end, back-end, DevOps, and DataOps. He currently works as a big data engineer for Japanese automotive manufacturers, with expertise in data collection and processing on AWS and Kubernetes and in building CI/CD pipelines.
Experience
● Distributed Data Collection, Processing, and Modeling for Japanese Automotive Manufacturers
-> Platform: AWS Kubernetes (EKS) -> Tools: Terraform, Helm -> CI/CD: Jenkins, ArgoCD -> Data Processing: Airflow, Spark -> Version Control: Git (AWS CodeCommit) -> Storage: AWS S3 -> Development Type: Individual Development
1. Data Collection Request: Collect data on manufacturers, models, launch years, grades, formats, engine types, and turbochargers for Japanese car companies (LEXUS, TOYOTA, NISSAN, HONDA, MAZDA, MITSUBISHI, SUBARU, DAIHATSU, SUZUKI).
2. Data Collection DAGs: Create data collection DAGs within the Airflow and Spark environment on EKS, customized for each manufacturer.
3. SparkApplication Configuration: Configure the SparkApplication to accept manufacturer information as parameters and set up AWS S3 integration to monitor driver and executor logs.
4. Application Deployment: Link the data collection Python script to the mainApplicationFile in the SparkApplication and verify deployment with ArgoCD using the latest built image.
5. Airflow Integration: In Airflow DAGs, invoke the SparkApplication deployed via ArgoCD, configure manufacturer parameters, and submit the Spark application using SparkKubernetesOperator.
6. Monitoring: Monitor the Spark application through ArgoCD and within EKS. After completion, verify the collected CSV data in the Spark History Server and AWS S3.
7. Distributed Collection Design: Parallelize URLs by model (e.g., Toyota Prius) and create RDDs for distributed collection. After collecting, process and model the data, save it to CSV, and upload it to AWS S3.
8. Performance Metrics (Pre-Distributed Collection): Using a single AWS m5.xlarge server: a. Collecting 1,678 LEXUS data entries with Python and Selenium takes approximately 3 hours. b. Collecting 40,385 TOYOTA data entries takes approximately 72 hours.
9. Performance Metrics (Distributed Collection): Using two AWS m5.xlarge servers (1 driver, 4 executors) with Python, Selenium, and Spark: a. Collecting 1,678 LEXUS data entries takes approximately 45 minutes. b. Collecting 40,385 TOYOTA data entries takes approximately 18 hours.

● Elasticsearch Engine Building and Operation (Backup/Recovery)
-> AWS Kubernetes (EKS) -> Terraform, Helm -> Jenkins -> Git (AWS CodeCommit) -> Individual Development
(Provisioning by Terraform and Jenkins)
1. Prepare a Helm chart for Elasticsearch 7.10.1 (versions 7.10 and earlier are available free of charge).
2. Update the values.yaml file in the Helm chart to install the repository-s3 plugin in initContainers.
3. Create a Kubernetes secret for S3 to back up data in the Elasticsearch pods.
4. Update the values.yaml file in the Helm chart to set the keystore with the secretName for AWS authentication.
5. Prepare one master node and two data nodes.
6. Configure the AWS, Kubernetes, and Helm providers in Terraform.
7. Configure a service account with the EBS CSI driver role for Elasticsearch in Terraform.
8. Add the EBS CSI driver addon with the role to the Kubernetes cluster in AWS using Terraform to help provision Elasticsearch-related PVCs.
9. Propagate the AWS accessKeyId and secretAccessKey to Elasticsearch when the helm_release starts via Terraform.
10. Set up a Jenkins pipeline to execute Terraform.
11. Back up pod data to S3 every day at 2 AM.

● Building an Apache Airflow Workflow Environment
-> AWS Kubernetes (EKS) -> Terraform, Helm -> Jenkins -> Git (AWS CodeCommit) -> Individual Development
(Provisioning by Terraform and Jenkins)
1. Configured the AWS, Kubernetes, and Helm providers in Terraform.
2. Configured a service account with the EBS CSI driver role for Airflow in Terraform.
3. Added the EBS CSI driver addon with the role to the Kubernetes cluster in AWS using Terraform, as it helps provision Airflow-related PVCs.
4. Created a Kubernetes secret for Git credentials to sync DAGs in Git.
5. Configured DAGs with gitSync and volume settings using the secret for Git credentials, allowing the Airflow webserver, scheduler, and workers to sync DAG files from Git.
6. Created an ACM certificate for Airflow and set up a Route 53 record in AWS.
7. Created an ALB ingress for Airflow and set up a Route 53 record in AWS.
8. Set up a Jenkins pipeline to execute Terraform.

● Building an Apache Spark Distributed Framework Environment
-> AWS Kubernetes (EKS) -> Terraform, Helm -> Jenkins, ArgoCD -> Git (AWS CodeCommit) -> AWS S3, AWS ECR (Docker Storage) -> Individual Development
(Provisioning by Terraform and Jenkins)
1. Configured the AWS, Kubernetes, and Helm providers in Terraform.
2. Configured service accounts with roles for the Spark Operator and Spark History Server in Terraform.
3. Created an AWS S3 bucket for analyzing Spark application logs.
4. Set a variable for the AWS S3 connection settings in the Spark History Server (`s3a://spark-event-log/history`).
5. Modularized the Spark Operator and Spark History Server configurations in Terraform.
6. Set up a Jenkins pipeline to execute Terraform.
(Docker File)
1. Created a Dockerfile for the Spark application.
2. Used the Docker image `spark:3.5.0-scala2.12-java11-python3-ubuntu`.
3. Downloaded and configured the JAR libraries `aws-java-sdk-bundle-1.12.262.jar` and `hadoop-aws-3.3.4.jar`, which enable uploading Spark application logs to AWS S3 for analysis on the Spark History Server.
(CI/CD Pipeline)
- CD
1. Configured deployment of the SparkApplication YAML file in ArgoCD with `apiVersion` `sparkoperator.k8s.io/v1beta2`.
2. Set the Spark application mode to cluster (standalone, client, and cluster modes are available) in the YAML file.
3. Specified the Spark application image location (`{account_id}.dkr.ecr.ap-northeast-1.amazonaws.com/spark-server:app-20240425-87`) in the YAML file.
4. Configured SparkConf for uploading Spark application logs to AWS S3 in the YAML file.
5. Set driver and executor configurations for resources and AWS authentication information.
- CI
1. Prepared a Jenkins pipeline file.
2. Checked out the GitOps and Spark application repositories.
3. Performed AWS CLI login authentication.
4. Built and pushed images with the Docker CLI to AWS ECR.
5. Updated the image version in the GitOps repository.
6. Deployed the Spark application via ArgoCD automatically as soon as the version changed.
7. Ran the Spark application using the Spark Operator.
8. Uploaded the related logs to S3 when the Spark application completed; the Spark History Server provided monitoring for log detection and analysis.

● Automating AWS Cloud Infrastructure with Terraform (Linux, Windows)
-> Terraform -> Jenkins -> Individual Development
1. Utilized Terraform and Jenkins for automated infrastructure provisioning.
2. Created an AWS IAM user and configured the Terraform source to automatically grant the relevant permissions for AWS cloud infrastructure automation.
3. Structured modules for network, containers, node_group (Linux, Windows), argocd, argocd_application, management, and database in sequential order.
4. Configured environments as dev, production, manage, and user, automated with Jenkins.
5. Used the network, containers, and node_group modules across all three environments (dev, production, manage).
6. The manage environment additionally used the argocd module, while the dev and production environments used the argocd_application, management, and database modules.
(Module Details)
1. The network module automatically configures the VPC, subnets, internet gateway, routing tables, and VPC peering.
2. The containers module automatically sets up the IAM role, policy, and tags related to creating an EKS Kubernetes cluster.
3. The node_group module automatically configures the IAM role, policy, and tags related to creating node groups for an EKS Kubernetes cluster.
4. The argocd module installs ArgoCD using helm_release after creating the AWS Load Balancer Controller, issues ACM certificates for HTTPS access to ArgoCD, automatically registers the ACM certificates in Route 53 for the ArgoCD ingress to the ALB, and finally registers the ArgoCD domain in Route 53.
5. The argocd_application module automatically issues ACM certificates and registers them in Route 53 for application servers, creates an application ingress to the ALB with the associated ACM certificates, and automatically registers the domain in Route 53 if there are no issues.
6. The management module automatically configures roles, policies, etc., for Fluent Bit to utilize monitoring tools.
7. The database module automatically installs AWS RDS and NoSQL resources.

● Migration of Vehicle Diagnostic Tool Management System from On-Premises Server to AWS Cloud Kubernetes System (Windows-based)
-> C#, PowerShell -> IIS 10 -> ASP.NET Core 2.0 -> Windows Server 2019 Core -> Amazon Elastic Container Registry (AWS ECR) -> Git (AWS CodeCommit) -> Individual Development
1. Analyzed the existing on-premises server environment, including IIS, ASP.NET, and libraries.
2. Configured PowerShell scripts to ensure seamless installation of the above environment in Dockerfiles.
3. Established a Windows-specific agent server for Jenkins to enable building Windows Docker images.
4. Configured the Jenkins pipeline to use the Windows-specific agent for builds.
5. Configured a repository (AWS ECR) to store Docker images after successful builds.
6. Installed and configured ArgoCD on Kubernetes (AWS EKS).
7. Applied GitOps principles, utilizing Git for managing deployment configuration files (yml).
8. Configured development and production deployment environments in ArgoCD.
9. Collected system logs and vehicle diagnostic tool authentication logs in Kubernetes using Fluent Bit and transmitted them to AWS CloudWatch in real time.
10. Set up dashboards and alarms to respond immediately to system failures or issues with vehicle diagnostic tool authentication.

● Building the Kubernetes Infrastructure and CI/CD Pipeline (Linux-based)
-> Git (AWS CodeCommit) -> Jenkins (CI), ArgoCD (CD) -> Docker and Kubernetes (AWS EKS) -> Amazon Elastic Container Registry (AWS ECR) -> AWS Simple Notification Service, AWS SQS queue (for webhooks)
1. Installed and configured the AWS CLI and kubectl.
2. Set up IAM roles for the EKS cluster and nodes.
3. Created and configured the AWS EKS cluster and nodes (Kubernetes environment).
4. Installed Jenkins on EC2 (AWS) and configured it with plugins.
5. Created a Jenkinsfile for the dev server pipeline in Jenkins.
6. Created Dockerfiles for each service associated with React, Node, and Python within the project.
7. Created a test code verification environment and test code for each service (React, Node, Python).
8. Set up and linked storage (AWS ECR) to store Docker images.
9. Installed ArgoCD in Kubernetes (AWS EKS) and configured it.
10. Applied GitOps with deployment settings files (yml) for each service in Git.
11. Configured development and production deployment server environments in ArgoCD.
12. Set up a Jenkins auto build and deployment environment triggered when developers commit/push to Git (AWS CodeCommit trigger, AWS Simple Notification Service, AWS SQS queue, Jenkins SQS plugin).

● Vehicle Fault Diagnosis and Resolution Service REST API Development
-> Python 3 -> FastAPI -> MongoDB (AWS DocumentDB) -> AWS Cloud (EC2, DocumentDB) -> Individual Development
1. Built a web framework with FastAPI.
2. Developed an API for storing and retrieving automotive repair information.
3. Developed an API for storing and retrieving the status of cars that require repair.
4. Developed an API for storing and retrieving chat messages.

● Development of Products for Accurate Inspection and Diagnosis by Vehicle Repair Shops
-> HTML5, SCSS (7-1 Sass Architecture), JavaScript -> Bootstrap, Material-UI -> React, Redux -> Node.js (Express framework) -> Webpack (custom) -> 2-person development
1. Addressed limitations in inspection and diagnosis by existing vehicle repair shops that relied on phone or fax responses.
2. Developed React web applications based solely on requirements (startup company).
3. Designed layouts consisting of sidebar, top, bottom, and content components.
4. Developed components that automatically retrieve vehicle information by scanning the QR code of the vehicle inspection certificate.
5. Developed UI components for fault code items detected with the G-Scan vehicle inspection terminal and items to be checked by maintenance engineers.
6. Applied a Material grid for inquiry, search, and filter functions.
7. Developed chat components that facilitate communication between repair shops and managers.
8. Implemented answers to inquiries in PDF format (FAX).
9. Implemented notification functions for repair shops and support centers when making inquiries and providing answers (WebSocket) (React, Node.js).
10. Implemented notification functions for adding/disabling partners (SMAS, etc.) and repair shop partners (WebSocket) (React, Node.js).
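The distributed collection design above parallelizes model URLs into RDD partitions before scraping. A minimal pure-Python sketch of that partitioning step, assuming round-robin assignment; `scrape_model` and the example URLs are illustrative stand-ins, and the real job would hand the URL list to Spark via `sc.parallelize(urls, num_partitions)` with a Selenium-based collector inside each partition:

```python
# Sketch of the per-model URL partitioning used for distributed collection.
# In the actual Spark job, these partitions become RDD partitions and each
# executor scrapes its own slice. `scrape_model` is a hypothetical stub for
# the Selenium-based collector.
from typing import Callable, Iterable, Iterator


def partition_urls(urls: list[str], num_partitions: int) -> list[list[str]]:
    """Round-robin model URLs across executor partitions."""
    parts: list[list[str]] = [[] for _ in range(num_partitions)]
    for i, url in enumerate(urls):
        parts[i % num_partitions].append(url)
    return parts


def scrape_partition(urls: Iterable[str],
                     scrape_model: Callable[[str], dict]) -> Iterator[dict]:
    """What each executor runs: collect every model URL in its partition."""
    for url in urls:
        yield scrape_model(url)


if __name__ == "__main__":
    # Hypothetical model URLs (e.g., one per TOYOTA model).
    urls = [f"https://example.com/toyota/model-{i}" for i in range(10)]
    parts = partition_urls(urls, 4)  # 4 executors, matching the setup above
    print([len(p) for p in parts])  # -> [3, 3, 2, 2]
```

With even partitions like this, total wall-clock time is bounded by the slowest partition, which is consistent with the roughly 4x speedup reported above for the LEXUS and TOYOTA collections.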
● E-mail System Development and Operation
-> JavaScript, jQuery -> Java (1.7), Spring, iBatis, DBMS (Oracle) -> Individual development and operation
1. Collaborated with a team to publish the necessary email templates.
2. Designed and registered tables to populate data into the email templates required by the marketing team.
3. Created and registered a shell script to parse and store monthly changed customer log data in the designed tables.
4. Created and registered JavaScript code in the email web system to retrieve data from the monthly updated customer tables and reflect it in the templates.
5. Registered the shell scripts and completed email templates with the scheduler system so they are sent automatically every month.
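The monthly flow above (parse customer log data into a table, then merge it into a registered template) can be sketched as follows. The field names, delimiter, and template text are all hypothetical, not the actual system's:

```python
# Sketch of the monthly email flow: parse a delimited customer log line
# into a record (step 3) and merge it into a registered template (step 4).
# Field names, the pipe delimiter, and the template are invented examples.

CUSTOMER_TEMPLATE = "Dear {name}, your plan changed to {plan} on {changed_at}."


def parse_log_line(line: str) -> dict:
    """Parse one pipe-delimited customer log entry into table columns."""
    name, plan, changed_at = line.strip().split("|")
    return {"name": name, "plan": plan, "changed_at": changed_at}


def render_email(record: dict) -> str:
    """Fill the registered template with one customer's monthly data."""
    return CUSTOMER_TEMPLATE.format(**record)


if __name__ == "__main__":
    line = "Kim|Premium|2013-05-01"
    print(render_email(parse_log_line(line)))
```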
● Threat Detection Analysis Platform - Convert and Upgrade Legacy Code to Vue
-> HTML5, CSS3, JavaScript (Vue.js), Nuxt (SPA), Vuex -> DevExtreme, Kendo, ECharts, Bootstrap-Vue -> 5-person development team
1. Led a pilot for new product development with a team of 5 members.
…
6. Reviewed and adapted legacy UI elements to work with Vue.

● Development of DDoS Attack Report RESTful API
-> Java (1.8), Spring Boot, JPA, DBMS (PostgreSQL) -> Lucene (7.0) -> Individual development
1. The authentication module for the RESTful API was omitted, as the product is installed on-premises rather than operated as the company's own service.
2. Provided five APIs for attack reports: Attack Termination List Report, Attack Termination Detail Report, Attack Traffic Report, Attack Time Report, and Attack Type Report.
…
5. Ensured data integrity, reliability, and performance improvements for each report.

● Development and Maintenance of a Threat Detection Analysis Platform
-> HTML5, CSS3, JavaScript (ECharts, D3.js), jQuery, AJAX, plugins (Kendo, jQuery-UI) -> Java (1.8), Spring Boot, JPA, DBMS (PostgreSQL) -> Lucene (7.0) -> 5-person development team
1. Developed UI components for dashboard panels, widgets (pie, list, map, bar graph), and related services (view, add, edit, delete).
…
6. Developed rule-based machine learning functions:
a. Incorporated machine learning capabilities leveraging t-digest, an unsupervised learning model.
b. Utilized t-digest attributes (such as CDF, quantile, max, and min) to display anomaly detection marks on the analysis result chart and applied t-digest attributes to the grid.
…
8. Developed an automatic IP blocking response:
a. Implemented a function to block IPs on each device by making REST API requests to linked devices (e.g., firewall equipment, DDoS equipment, network intrusion prevention equipment).
b. Automatically detected threat IPs based on internal filter rules and executed automatic blocking responses through a scheduler.
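Item 6b above marks anomalies using t-digest quantiles. A rough sketch of the idea, substituting an exact sorted-list quantile for the compressed streaming t-digest structure; the 0.99 threshold and the traffic values are illustrative choices, not taken from the product:

```python
# Sketch of quantile-based anomaly marking (item 6b). A real t-digest keeps
# a compressed streaming summary of the distribution; here an exact quantile
# over the full data stands in for it. The 0.99 cutoff is illustrative.
import math


def quantile(values: list[float], q: float) -> float:
    """Exact q-quantile via linear interpolation over the sorted values."""
    s = sorted(values)
    pos = q * (len(s) - 1)
    lo, hi = math.floor(pos), math.ceil(pos)
    return s[lo] + (s[hi] - s[lo]) * (pos - lo)


def anomaly_marks(values: list[float], q: float = 0.99) -> list[bool]:
    """Flag points above the q-quantile, as drawn on the analysis chart."""
    threshold = quantile(values, q)
    return [v > threshold for v in values]


if __name__ == "__main__":
    traffic = [10.0] * 99 + [500.0]  # one spike in otherwise flat traffic
    print(sum(anomaly_marks(traffic)))  # -> 1
```

The t-digest's advantage over this sketch is that its quantile estimates stay accurate at the tails while using bounded memory over an unbounded event stream, which is what makes it practical inside a detection pipeline.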
● Development and Operation of Naver Store Farm API Integration System
-> Linux, Shell Script, PHP (5.2), MySQL -> Individual development
1. Developed an API system using SOAP communication to address the limitations sellers faced in individually registering and managing products or utilizing auto and e-sales tools.
2. Obtained in-house server certification for the Naver Store Farm API and built an API integration system on the related servers.
…
4. Defined scheduler jobs (such as new product registration, product modification, additional product registration, and product service interruption) within the API integration system, leveraging schedulers, controllers, and services.
5. Automated product registration (10,000 and 300,000 items), modification, and deletion processes based on the product data managed by OwnerClan, aiming to enhance convenience for sellers.
6. Implemented log file processing to track and validate automated scheduler jobs in case of failures.
7. Scaled the infrastructure by adding more databases and web servers to accommodate the growing number of sellers.

● Operation and Development of a B2B Distribution System
-> Linux, jQuery, AJAX, PHP (5.2), MySQL -> Collaborated with the entire team
1. Managed and further developed the membership, product, and payment components of the B2B distribution system.
2. Addressed issues in the vendor (SCM), manager, and seller areas, and developed new pages as required.

● B2B Mobile Web (SPA) and Service Development
-> Ionic framework, AngularJS, jQuery, AJAX, PHP (5.2) -> 5-person development team
1. Worked as part of a team of 5, including 3 developers, 1 planner, and 1 designer.
2. Led the development by introducing the Ionic framework and leveraging AngularJS to create a hybrid application.
…
5. Integrated the KCP payment module for various payment methods, including bank transfers and card payments.
● [CNM, CJ] Developed LG-Samsung Smart TV App (UHD, FHD)
-> HTML5, CSS3, JavaScript, jQuery, AJAX -> 5-person development team
1. Collaborated on UI framework, activation, VOD, search, channel, and settings features, including scheduling and role sharing.
2. Parsed XML data obtained from the SD&S server and transformed it into a JSON tree structure.
…
5. Implemented features such as EPG, MINIEPG, and OSD by utilizing channel and program information.
6. Developed a purchasing process for channel products, integrating with the RCP (Remote Control Protocol) system.
7. Implemented configuration functions such as changing authentication numbers, age restrictions, schedule notification settings, manual channel deletion, channel limit settings, and system settings.

● KT Development (TV Set-Top Box)
(TV Album App) -> HTML5, CSS3, JavaScript, jQuery, AJAX -> Individual development
…
(Homeportal development) -> HTML5, CSS3, JavaScript, jQuery, AJAX -> 5-person development team
…

● Developed Tbroad (TV Set-Top Box)
-> HTML5, CSS3, JavaScript, jQuery, AJAX, Java, Spring Framework (3.x) -> Individual development
(YouTube - Web App for App Store)
1. Developed UI and GUI elements inspired by the YouTube homepage.
…
4. Incorporated an OSK (On-Screen Keyboard) supported by middleware for search functionality.
(Kids Book, Kids English, Weather, Horoscope - Web App and App Store)
1. Converted Java emulator-based versions into web apps.
…
5. Utilized a web service built with Spring 3.1 instead of directly connecting to JSON files for real-time weather and horoscope data.
(Browser UI Development)
1. Collaborated with the browser team on site content and the APIs controlling those sites, specifically addressing inoperable APIs and additionally requested APIs.
2. Conducted UI and GUI reviews.
…
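Step 2 of the Smart TV app work parses server XML into a JSON tree. A small sketch of that conversion using Python's standard library; the sample channel XML is invented, and the actual SD&S schema differs:

```python
# Sketch of XML -> JSON-style tree conversion (step 2 of the Smart TV app).
# The sample channel XML is invented; the real SD&S schema is different.
import json
import xml.etree.ElementTree as ET


def element_to_tree(el: ET.Element) -> dict:
    """Recursively convert an XML element into a nested dict (JSON tree)."""
    node: dict = {"tag": el.tag}
    if el.attrib:
        node["attrs"] = dict(el.attrib)
    text = (el.text or "").strip()
    if text:
        node["text"] = text
    children = [element_to_tree(c) for c in el]
    if children:
        node["children"] = children
    return node


if __name__ == "__main__":
    xml = "<channels><channel id='7'><name>News</name></channel></channels>"
    tree = element_to_tree(ET.fromstring(xml))
    print(json.dumps(tree))
```

A tree of plain dicts like this is convenient on a set-top box because channel and program lookups for EPG/MINIEPG rendering become simple key accesses instead of repeated XML traversals.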
Activity
Recent Activity
Certifications 2
Projects 11
Project
Development of a vehicle parts sales inquiry management system platform (React)
Intersupport (freelance) · September 2021 - Present · 4 years 5 months
Development of a vehicle parts sales inquiry management system platform -> html5, scss, javascript -> react, redux, webpack, bootstrap, material-table -> individual development
Project
Development and maintenance of a threat detection analysis platform
AhnLab · April 2017 - February 2021 · 3 years 11 months
-> html5, css3, javascript (echart, d3.js), jquery, ajax, plugins (kendo, jquery-ui) -> java (1.8), spring boot, JPA, DBMS (postgresql), netty -> lucene (7.0) -> 5-person development (team)
Project
Conversion and upgrade of the threat detection analysis platform's legacy code to Vue (Vue.js)
February 2020 - July 2020 · 6 months
-> html5, css3, javascript (vue.js), nuxt (spa), vuex -> devextreme, kendo, echart, bootstrap-vue -> git, sourcetree -> 5-person development (team)
Project
Development of DDoS attack report RESTful API
AhnLab · March 2018 - June 2018 · 4 months
-> java (1.8), spring boot, JPA, DBMS (postgresql) -> lucene (7.0)
Project
Development and operation of Naver Store Farm API integration system
Individual development · March 2016 - April 2017 · 1 year 2 months
-> linux, shell script, php, mysql -> individual development
Languages
Intermediate (business communication)
Intermediate (business communication)