My CLI Project was a wake-up call!

Emrich-Michael Perrier
3 min read · Feb 24, 2021

I am not going to lie, this CLI project was very difficult for me. I was a week behind the program because I was not feeling well, and I spent all of project week catching up on the previous lessons. I finally got to the project on Friday and struggled to finish it by the deadline. I wish the circumstances had been different, but I think I actually learned a lot from this project and got comfortable with many techniques. I got more comfortable with OO Ruby as a whole, and I now believe that whenever I have to tackle something like this again, I should be set. If I can do something like this in a couple of days, a week should be easier to handle (fingers crossed). I also got more comfortable with scraping, and I think that the next time I need to scrape some data using Nokogiri, it should be cake!

To dive into the technical side of things, I would like to explain exactly what I was struggling with. With OO Ruby, I struggled for a while with the concept of attr_accessor — setters and getters specifically. I did not really understand what they did or how to use them; I just knew I had to write them in for the attributes in the #initialize method. I see now that the setter sets an instance variable to a value, and the getter reads that value back out of the object. attr_accessor was basically creating two methods for each attribute — just one line of code was doing all that!

Furthermore, I want to talk about how I was having trouble with scraping. For a while I did not know how to find the specific class the data was stored in. This was less about using the Nokogiri methods and more about using the "inspect element" tool to find the classes for the data I wanted. Once I got there, I did not know how to use the Nokogiri methods either, but after reviewing the scraping lab and googling the Nokogiri errors, I figured it out. Specifically, I was using html = open(URL) to get the raw HTML from my URL. For some reason it would work, but it would still show error messages. To get past this, I googled around and found out it had to be changed to html = URI.open(URL) to work properly without showing any errors. After that, I struggled with "cleaning" up the data so that it would be stored in the arrays correctly, but I managed it by simply viewing the data and cleaning it with #pop and #split. Once this was finished, I just had to get the CLI class to kick things off and use the Scraper and Player classes inside it. Because I did not start the project with Bundler, I was forced to understand how to set up my environment file, and that really helped me understand how these folders and files work with each other.
Additionally, setting up the environment helped me understand the use of, and the difference between, require and require_relative.
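Since the two look so similar, here is a tiny runnable sketch of the difference — the player.rb and environment.rb file names are illustrative, written into a temp directory so the example is self-contained:

```ruby
require "tmpdir"

# require searches Ruby's load path and installed gems;
# require_relative resolves a path against the file that contains the call.
Dir.mktmpdir do |dir|
  File.write(File.join(dir, "player.rb"), "PLAYER_LOADED = true")
  File.write(File.join(dir, "environment.rb"), <<~RUBY)
    require "set"             # stdlib lookup on the load path
    require_relative "player" # sibling file, relative to environment.rb itself
  RUBY

  require File.join(dir, "environment.rb") # absolute paths also work with require
end

PLAYER_LOADED # => true: environment.rb pulled in its sibling file
```

This is why an environment file full of require_relative lines can sit anywhere in the project: each path is anchored to the environment file itself, not to wherever you launched Ruby from.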

All in all, I think that seeing how you perform in pressuring, stressful situations is good for the soul and the character. I would not like to feel the amount of anxiety I experienced during this again; however, I am glad it is something I went through, because I believe it will make me a better person and hopefully… a better coder!!!!

Emrich-Michael Perrier

Full-time Frontend Developer, anime nerd, and lifter of heavy weights in competition