Difference between revisions of "CV - Work experience"
'''[https://www.idg.com/idgdirect/ IDG Direct], Ireland'''
</p>
'''Real Time Data Analyst'''
</div>
<!-- * This is a pre-sales role. I represent IDG services by making professional outgoing calls to prospective clients. I have to establish and maintain a professional conversation with IT Managers to identify their needs and next investments. The gathered information is required from our clients (Largest Tech Companies) and used in the next step of the sales process. -->
I'm responsible for analyzing and monitoring call center data, including call volumes, average handle times, queue times, agent availability, performance indicators and inactivity levels.
* Finding patterns and trends in the data to help increase productivity and forecast requirements.
* Generating ideas for process and service improvement.
* Producing daily, weekly and monthly internal reports to assist with the creation of metrics and targets for services.
* Working closely with the operations team to analyze and help improve their delivery processes.
<!--end----------------------------------------------------------------------------->
|-
Revision as of 15:17, 26 June 2022
Present ↑ 2017 |
IDG Direct, Ireland Real Time Data Analyst I'm responsible for analyzing and monitoring call center data, including call volumes, average handle times, queue times, agent availability, performance indicators and inactivity levels.
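For illustration, a minimal sketch of the kind of aggregation behind these indicators (the call records and column names below are made up, not IDG's actual reporting format): per-agent call volume, average handle time and average queue time.

<syntaxhighlight lang="python">
# Minimal sketch with made-up call records and hypothetical column names.
import pandas as pd

calls = pd.DataFrame({
    "agent":       ["A", "A", "B", "B", "C"],
    "queue_secs":  [12, 45, 30, 8, 60],         # time the caller waited in queue
    "handle_secs": [310, 250, 420, 180, 390],   # talk time plus wrap-up time
})

# Per-agent call volume, average handle time and average queue time.
report = calls.groupby("agent").agg(
    calls_handled=("handle_secs", "size"),
    avg_handle_time=("handle_secs", "mean"),
    avg_queue_time=("queue_secs", "mean"),
)
print(report)
</syntaxhighlight>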
|
Present ↑ 2017 |
IDG Direct, Ireland Senior Business Development Executive
Communication and Sales Skills
Targets and KPIs
|
2014 |
WikiVox, France Web Programmer
WikiVox is a nonprofit organization whose goal is to create a wiki for debating political, economic and environmental topics. The aim is a discussion method capable of generating, at some point in the debate, an article with precise suggestions that contribute to solving the problem under discussion. When I joined, the project was just starting: its philosophy was already mature, but the implementation of the wiki was in its first phase. It was a very positive experience, and I particularly liked the philosophy of the project. Working in a small organization at that point in my career gave me responsibilities I would not have had in a large company, so I learned a great deal. My responsibilities covered (1) the administration of a Linux web server and (2) the design of the website.
Wiki - Organize information into a cohesive, searchable and maintainable system.
|
2012 ↑ 2011 |
Simón Bolívar University - Funindes USB, Venezuela Research Geophysicist of the Parallel and Distributed Systems Group (GRyDs) Click here to see some examples of my work in seismic modelling.
Task automation using shell scripting: for example, generating the image frames used to build seismic wave propagation videos, and automatically producing PDF reports with LaTeX that summarized each run (execution time versus the amount and characteristics of the data generated).
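A minimal sketch of this kind of automation follows. The original scripts were written in shell; this illustrative version uses Python instead, and the directory names, frame format, report fields and figures are all hypothetical. It assumes ImageMagick's convert and pdflatex are available on the system.

<syntaxhighlight lang="python">
# Illustrative sketch only: convert wavefield snapshots into video frames
# and compile a small LaTeX summary report. All paths and values are made up.
import glob
import subprocess

# 1. Turn each wavefield snapshot into a PNG frame for the propagation video
#    (assumes ImageMagick's `convert` is installed and snapshots are PPM files).
for snapshot in sorted(glob.glob("snapshots/wavefield_*.ppm")):
    frame = snapshot.replace(".ppm", ".png")
    subprocess.run(["convert", snapshot, frame], check=True)

# 2. Write a short LaTeX report with run statistics and compile it to PDF.
stats = {"run time (s)": 1824, "traces generated": 9600, "output size (GB)": 3.2}
rows = "\n".join(rf"{name} & {value} \\" for name, value in stats.items())
with open("report.tex", "w") as tex:
    tex.write(
        "\\documentclass{article}\n\\begin{document}\n"
        "\\section*{Modelling run summary}\n"
        "\\begin{tabular}{lr}\n" + rows + "\n\\end{tabular}\n"
        "\\end{document}\n"
    )
subprocess.run(["pdflatex", "-interaction=nonstopmode", "report.tex"], check=True)
</syntaxhighlight>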
I gained skills in MATLAB, Scilab and shell scripting during my participation in an R&D unit at Simón Bolívar University (the Parallel and Distributed Systems Group - GRyDs). MATLAB (matrix laboratory) is a language and numerical computing environment that supports data analysis and visualization, matrix manipulation and numerical computation, with a large library of functions for mathematical and engineering problems. I used it mainly for signal analysis, specifically for seismic data analysis.
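For illustration only (this is not the original MATLAB code, and the trace below is synthetic), the same kind of seismic trace analysis can be expressed with NumPy/SciPy: band-pass filtering a noisy trace and inspecting its amplitude spectrum.

<syntaxhighlight lang="python">
# Illustrative sketch with a synthetic trace, not the original MATLAB code.
import numpy as np
from scipy.signal import butter, filtfilt

dt = 0.002                            # 2 ms sampling interval
t = np.arange(0, 2.0, dt)             # 2 s trace
# A 30 Hz "reflection" plus random noise stands in for a real seismic trace.
trace = np.sin(2 * np.pi * 30 * t) + 0.5 * np.random.randn(t.size)

# Band-pass 10-60 Hz, a typical pre-processing step for reflection data.
nyquist = 0.5 / dt
b, a = butter(4, [10 / nyquist, 60 / nyquist], btype="band")
filtered = filtfilt(b, a, trace)

# Amplitude spectrum of the filtered trace.
freqs = np.fft.rfftfreq(t.size, dt)
spectrum = np.abs(np.fft.rfft(filtered))
print(f"Dominant frequency after filtering: {freqs[spectrum.argmax()]:.1f} Hz")
</syntaxhighlight>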
|
2011 ↑ 2010 |
CGGVeritas, Venezuela Seismic Data Processing Analyst I was responsible for a range of signal analysis, time-series analysis and data processing tasks for oil and gas exploration.
|
2010 ↑ 2008 |
Simón Bolívar University, Venezuela Academic Assistant - Earth Sciences Department
I have three years of experience as an academic assistant in the courses of Seismic Processing, Seismic Reservoir Characterization, and Seismic Methods. During this time I solidified my knowledge of the theoretical basis of seismic processing, in particular the technical concepts required for this position, such as seismic velocity analysis, multiples, surface statics corrections, noise attenuation and imaging. I was assigned three times to teach the Seismic Data Processing course, giving both theoretical and practical lessons. The theoretical part focused on signal theory (discrete signal analysis, sampling, aliasing and the discrete Fourier transform) and on the theoretical aspects of each stage of a conventional seismic processing sequence. In the practical part, the students processed a 2D seismic data set using Seismic Unix, free software developed by the Colorado School of Mines. I was the assistant to the lecturer in charge, but having participated in the course three times I was responsible for a large part of it. |
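As a small illustration of the sampling and aliasing topic covered in the theoretical lessons (my own example, not course material): a 150 Hz cosine sampled at 200 Hz lies above the 100 Hz Nyquist frequency, so its discrete Fourier transform shows a peak at the aliased frequency of 50 Hz.

<syntaxhighlight lang="python">
# Aliasing demonstration: a 150 Hz cosine sampled at 200 Hz appears at 50 Hz.
import numpy as np

fs = 200.0                        # sampling frequency (Hz); Nyquist = 100 Hz
t = np.arange(0, 1.0, 1.0 / fs)   # 1 s of samples
signal = np.cos(2 * np.pi * 150.0 * t)

freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
spectrum = np.abs(np.fft.rfft(signal))
print(f"Spectral peak at {freqs[spectrum.argmax()]:.1f} Hz (aliased from 150 Hz)")
</syntaxhighlight>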