Site Log

2024-09-26 -- JavaFX
Had the idea of trying to create my own NHL scoreboard using Java, JavaFX, and the NHL API. The NHL predictions rewrite has therefore been paused while I learn more about JavaFX.
2024-09-26 -- Almost done with rewriting the NHL stats backfiller as a Quarkus app.
It's taken me a lot longer than originally anticipated because life got busy. The backfiller is almost done; it just needs to be tested to make sure it behaves the same as the Golang version.
2024-08-18 -- Quarkus Rewrite complete
Finally finished the rewrite of the website to use the Java Quarkus framework instead of Golang. The rewrite wasn't actually too bad. The main hiccups were learning the Qute templating engine syntax and figuring out where to put templates and static HTML resources so that they are properly found when creating native executables. Finally, I had to remind myself how to link my Firebase hosting with GCP Cloud Run to serve dynamic content; I did this once when I set up the Golang website and needed to do it again to map requests to the Quarkus Cloud Run service. All in all, it wasn't too bad, and I'm looking forward to rewriting the NHL predictions engine in Java with Quarkus.
2024-07-23 -- Quarkus?
Last year I was all about Golang; now I'm thinking I want to lean back into Java and explore the Quarkiverse. I'm thinking I may rewrite the website as a Quarkus webapp and then rewrite the NHL stats app as a Quarkus application as well before the start of the next NHL season. Could be something fun to keep me busy and brush up on some new Java technology.
2023-12-16 Sat -- restructuring of go code and moving to cloud run
Despite not updating the site log, I have been busy restructuring the Go code for the nhlstats project. I originally moved all of the logic into an internal folder and then created a cmd folder that housed separate main.go files to run either the backfiller or the predicter utility. Recently, however, I figured it didn't make sense to hide all of my logic in an internal folder, so I moved the logic into its own modules under the root folder and switched to a single main.go file that runs either the backfiller or the predicter based on a command-line flag. On top of this restructuring I also decided to move away from cron jobs running on my local machine to Cloud Run jobs in GCP. I now have a script that builds a Docker image based on a distroless Debian image, pushes the image to GCP Artifact Registry, and finally creates two Cloud Run jobs (one to execute the predicter and one to execute the backfiller), both of which are triggered by Cloud Scheduler. I'll be curious to see how much these Cloud Run jobs cost me over the next month (last month's GCP bill was only $0.01).
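The flag-based dispatch in the single main.go can be sketched roughly like this. The mode names and stub actions here are assumptions for illustration, not the actual project code:

```go
package main

import (
	"flag"
	"fmt"
	"os"
)

// dispatch maps the -mode flag value to the utility to run.
// The returned string stands in for actually invoking the utility.
func dispatch(mode string) (string, error) {
	switch mode {
	case "backfiller":
		return "running backfiller", nil
	case "predicter":
		return "running predicter", nil
	default:
		return "", fmt.Errorf("unknown mode %q", mode)
	}
}

func main() {
	// A single binary selects its job via a command-line flag.
	mode := flag.String("mode", "backfiller", "which utility to run: backfiller or predicter")
	flag.Parse()

	msg, err := dispatch(*mode)
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Println(msg)
}
```

With this shape, the two Cloud Run jobs only differ in the argument they pass (`-mode=backfiller` vs `-mode=predicter`) while sharing one image.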
2023-11-09 Thu -- nhl predictions page back online
After two days of rework and learning the new NHL stats API (which can be found at https://api-web.nhle.com/v1), the NHL predictions page is back online. The logic in the playerutil and predictionutil modules has been adjusted to use the new response format from the NHL API; the new data format required quite a few logic changes to get things working correctly. Shout out to https://gitlab.com/dword4/nhlapi for finding and documenting the new endpoints.
2023-11-07 Tue 12:46 -- nhl stats api has been taken down
As of this morning the NHL stats API is no longer reachable. I need to investigate whether there is another way to get the data I need. In the meantime I'm taking down the NHL predictions page.
2023-11-06 Mon -- nhl predictions now live
The NHL predictions page has now been live for the past couple of weeks. The predictions are powered by a logistic regression model that was trained on historical NHL data; both the model and the data are stored in GCP BigQuery. Every morning the model is retrained with yesterday's games, and then every hour starting at noon a Go program checks whether today's game lineups have been announced; if so, it uses the BigQuery model to predict the winners of today's games. Right now the prediction and the retraining of the model are run via cron jobs on my local machine, but the plan is to move them to GCP Cloud Run jobs in the near future. At first I was using GCP Vertex AI to create and train my logistic regression models, but that service seems to incur higher costs than doing the creation and training through BigQuery. Also, how about the Ducks and Red Wings?
2023-10-09 Mon 11:47 -- Update
Not much to report on the site because I've been focused on gathering historical NHL data to learn some ML concepts using GCP BigQuery and GCP Vertex AI. Because of this I've updated the GCP billing costs section of this site with a disclaimer noting that the costs no longer represent the cost of just running the website, since the majority of the costs reflected there are from running Vertex AI training models (which, I'm learning, can become pretty costly pretty quickly).
2023-09-09 Sat 10:10 -- Added a simple contact info page.
Lately, my attention has been on a major project at work: streamlining our GCP infrastructure with Terraform and setting up CI/CD for our applications. This has kept me quite occupied. Though there haven't been many recent updates on the site, I did add a new Contact page with my email, LinkedIn, and GitHub links to make it easy to connect.
2023-09-05 Tue 09:56 -- Deploying gcp costs to cloud run.
Deploying BigQuery code on Google Cloud Run had its share of challenges. Initially, SSL certificate errors cropped up due to missing ca-certificates in the base Alpine builder image. I resolved this by including the necessary certificates in the image build process. Additionally, fine-tuning service account permissions was essential. After adding the right roles, the BigQuery jobs ran smoothly, and the costs page now functions properly.
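A minimal multi-stage Dockerfile sketch of the ca-certificates fix, assuming an Alpine builder stage and a bare runtime image. The paths and binary name are placeholders, not the actual build files:

```dockerfile
# Builder stage: Alpine images don't ship CA certificates by default,
# which causes SSL verification errors when the Go binary calls Google APIs.
FROM golang:alpine AS builder
RUN apk add --no-cache ca-certificates
WORKDIR /app
COPY . .
RUN CGO_ENABLED=0 go build -o server .

FROM scratch
# Copy the certificate bundle from the builder so TLS verification works
# in the final image even though it has no package manager.
COPY --from=builder /etc/ssl/certs/ca-certificates.crt /etc/ssl/certs/
COPY --from=builder /app/server /server
ENTRYPOINT ["/server"]
```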
2023-09-04 Mon 10:10 -- Setting up billing export
Set up the BigQuery dataset and enabled Cloud Billing export to the new dataset so that I can query the data from a Golang BigQuery client and display it on the website. Added the BigQuery client to the Go server and set up the gcp_cost.html template file to display the data from the BigQuery results. Local tests seem to be working fine. Now I need to test in Cloud Run to make sure the Cloud Run service has the appropriate permissions to execute BigQuery jobs.
2023-09-03 Sun 17:34 -- Figuring how to retrieve gcp billing costs programmatically
Today was a pretty light day working on the site. Mainly just research on how to get billing cost data programmatically. Looks like I will have to set up continual billing exports to BigQuery and then query the BigQuery table to get what I'm after for the billing costs page.
2023-09-02 Sat 09:08 -- Still Reading Tutorials...
Haven't done much with the site for the past few days due to focusing on other things in life. Almost done reading the MDN HTML tutorial, and then I'll move on to the CSS tutorial. In the meantime I've also had the idea to create a billing section on the site to show how much this whole thing is costing me in GCP (spoiler: so far it's only cost me three pennies, excluding the cost of the domain name). I'm thinking I'll start small with simple stats and then hopefully grow it into a cool graph using D3.js (which will mean more tutorials).
2023-08-27 Sun 16:49 -- The Power (and Fun) of Go Templates
Updated the Go server to leverage the standard library's html/template package to create the site_log.html page. I now write the site log entries in YAML (each entry is an element in a YAML array), and site_log.html is now a Go HTML template. At server startup I load the template, read in the YAML file, and unmarshal it into an appropriately defined Go struct using the <a href="https://pkg.go.dev/gopkg.in/yaml.v2" target=_blank>yaml go package</a>. This removes the manual step of moving entries from my Emacs Org file into an HTML page: I just write log entries in the YAML file, redeploy, and the Go server takes care of the rest.
2023-08-26 Sat 16:49 -- gcloud cli and Artifact Registry fun
Finally figured out how to delete old Artifact Registry images using the gcloud CLI. gcloud commands allow formatting and filtering the output via the aptly named --format and --filter flags. So I run a <code>gcloud artifacts docker images list</code> command with a filter that limits the output to images older than one day and a format that returns only the package and version fields. The output is stored in a bash array, which is then looped through, passing each item to the <code>gcloud artifacts docker images delete</code> command. Now I don't have to worry about exceeding the free size limit on GCP Artifact Registry.
2023-08-25 Fri 19:26 -- Semantics and Aesthetics
Learned more about semantic and non-semantic tags and tried to organize my site using logical semantic tags. Also updated the colors to something more aesthetic (according to ChatGPT) and updated the powered-by logos to match the new background color. Finished the HTML tutorial on the MDN site. Next up is their multimedia tutorial.
2023-08-23 Wed 07:47 -- Powered By
Added a powered-by section that includes SVGs of each product I am using, with links to their home pages. One interesting thing: img tags line up horizontally, while heading and paragraph tags line up vertically on the page. Hoping the layout portion of the MDN tutorial sheds some additional light on why. Also fixed some mobile usability issues pointed out by Google Search Console, namely text being too small and the lack of a viewport meta tag for mobile device compatibility.
2023-08-22 Tue 08:58 -- Cloud Run, SVG's, Organization, and Docker cached layers
Updated the Go server to read from the $PORT environment variable, because it is injected by Google Cloud Run when launching the container. Moved all the HTML, CSS, and SVG images into a public folder for organization. Since doing that, however, the SVG file did not load on the site when going to tcourdy.com, even though it loaded fine from the Firebase URLs and the Google Cloud Run URL. Really scratched my head over this one. *Update: Turns out it must have been something with the Docker image. After a redeploy the SVG is working as expected (I suspect a Docker image layer was cached).*
2023-08-21 Mon 12:33 -- Decrease Load Times
Updated the html to remove pulling in the material icons and css for quicker page load time.
2023-08-20 Sun 09:33 -- General Housekeeping
Created a new service account with minimal Cloud Run permissions to adhere to Google's security best practices. Updated the Go server to just serve static files for now; no sense in pulling in the HTML template library when there's no dynamic content.
2023-08-13 Sun 12:22 -- Firebase and Domain Registration
Set up a Firebase Hosting account and registered the domain tcourdy.com via Google Domains. Then routed traffic from the custom domain to the Google Cloud Run application serving the static welcome page. Also wrote a little script that builds the Docker image, pushes it to GCP Artifact Registry, and redeploys the GCP Cloud Run service.
2023-08-13 Sun 12:20 -- The Beginning
Set up my Google Cloud Platform account and wrote a simple hello world web server using the Go standard library. Deployed to Cloud Run via a Docker image stored in Google's Artifact Registry. The whole process was really easy to get up and running because I have some experience with Google Cloud Run through my place of employment.