
Calculators Didn’t Kill Math: Why You Must Know Linux To Master the Cloud

February 3, 2020 (updated February 6, 2020) | Announcements

Chances are good that at some point on your way to graduating high school you complained bitterly about needing to master the fundamentals of math when you had a perfectly good calculator. Who cares? I'll NEVER use this! I might as well learn the abacus! One of those probably rings a bell. Somewhere along the way, you've likely decided that your teachers (and parents) were right and that, despite calculators, there are many ways, big and small, in which you regularly put your math knowledge to use.

Just as having a calculator in hand doesn't eliminate the need to know math, using hosted cloud resources (AWS EC2, Azure, etc.) doesn't eliminate the need to know the underlying operating system (which is Linux for over 98% of EC2 instances and over 60% of Azure instances). Sure, that EC2 or Azure server is cheap, flexible, and easy to use; that's why so much computing has shifted to the cloud. However, the magic of the web-based GUI can only take you so far. If you're going to be a cloud professional, you need to learn the basics. You need to learn Linux.

The support documentation from the providers themselves demonstrates the importance of understanding the Linux fundamentals. Here's AWS on fixing performance issues, using SSL, and updating/installing software. Or check out Azure's take on troubleshooting a boot error, resetting a password, or checking your SSH connection.

To illustrate this point further, our senior instructors share real-life examples where they have solved a problem or found an easier solution directly from the Linux command line. 

Converting Video 

4K and UHD video content is becoming the standard, including from newer cell phones. While recently working on a project, I found my older Linux workstation couldn't smoothly play back the high-resolution content. With a lot of content to review, I needed easier-to-digest video, sometimes called a proxy. The editor had no ability to convert. Luckily, the ffmpeg command has lots of options for transcoding, with the ability to change almost every aspect of the video. Using a terminal, a for loop, and ffmpeg, I converted the clips from a proprietary format to an open one, and at a much lower resolution.

for name in P68.MOV P74.MOV P76.MOV P77.MOV P78.MOV;
do ffmpeg -i "$name" -vf scale=-1:270 -c:v libx264 -profile:v high422 -crf 12 -intra \
   -tune film -preset veryfast -c:a aac -b:a 384k ~/Desktop/"$name".mp4;
done

After thirty minutes all the lower-resolution proxy files were available and I was able to view and edit them without issue. Once all my cuts had been made I replaced the proxy clips with the higher-resolution originals and rendered out the final project.

-Tim Serewicz

Automate a Process

I maintain a number of embedded courses for the Linux Foundation, which are updated to coincide with each Linux kernel release, roughly every two months. Added to that are the dozen or so other codebases used in these trainings. For any particular update, at least half of those codebases have changed since the last revision of the course. With so much churn affecting the content, I was sometimes spending weeks every couple of months just manually building, testing, and recreating all the output, listings, and boot sequences from embedded hardware. This is a time-consuming and error-prone process.

However, scripting allows you to automate any process that is repetitive and otherwise mechanical. If you can see a pattern while doing something manually, you can teach a computer to do it for you. Test building all the labs? Drive makefiles from bash scripts, and grab artifacts from the output. Test running the labs on embedded hardware and collecting sample output of things running correctly? expect scripts to the rescue. Graphing changes to codebases over time? gnuplot and Graphviz do this very nicely. Including sample code in the courseware? grep, sed and awk can do this quite easily. And wkhtmltoimage can capture webpages (even ones rendered with JavaScript) from the command line and spit out a PNG file.
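As a minimal sketch of the "drive makefiles from bash scripts and grab the artifacts" idea (the labs/ directory layout, the "all" make target, and the *.bin artifact names here are placeholders, not the actual course setup), a short loop can build every lab and collect the logs and build products in one place:

#!/bin/bash
# Hypothetical sketch: build each lab directory and collect its output.
# The labs/ layout, the "all" target, and the *.bin artifacts are
# invented for illustration.
set -e
mkdir -p artifacts
for lab in labs/*/; do
    name=$(basename "$lab")
    echo "Building $name..."
    make -C "$lab" all > "artifacts/$name.log" 2>&1
    cp "$lab"/*.bin artifacts/ 2>/dev/null || true   # keep any binaries the build produced
done
echo "Logs and build products collected in ./artifacts"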

With the assets for the class built using automated processes, more time can be spent updating the course text in LaTeX (which, of course, is built with make and stored in git). There is honestly no possible way I could ever keep up with this workload if I were forced to manually use graphical tools to do these jobs.

It looks like Linus is about to release a new kernel, which means this process needs to start all over again.

-Behan Webster

Monitoring a Server

I often find myself performing repetitive daily, weekly or even monthly tasks that could be easily automated. From downloading enrollment lists from cloud buckets and compiling expense reports, to somewhat more sophisticated infrastructure monitoring, I find myself in a position where I can leverage Linux scripting to get some jobs done for me, rather than by me, repetitively.

This one time, I was tasked to monitor the activities of a critical transaction processing server, retrieve its logged errors, and then run various commands to correct those errors. This became one lucky person's task because the dedicated enterprise monitoring tool did not play nicely with the transaction processing server; they would constantly crash each other. However, what I had expected to be a once-in-a-blue-moon request became a mundane daily cycle of monitoring, finding errors, then fixing errors. After a while, once I had the chance to run through every possible error scenario, I realized that the ordering of the steps never changed and that the only variables were error codes and server agent process names, with very few exceptions. So, I immediately started working on automating the entire process through a script. I had wasted enough time away from my coworkers, who were enjoying friendly chats over a good cup of coffee 🙁

While the Linux tools I used were everyday tools, it took some time to complete the entire automation script. I was not completely familiar with all the tools' options and expected syntax, so I had to research and test them before implementing them in my script. In essence, I used Linux commands such as date, tail, grep, egrep and cut to parse through the daily log files. Then while loops, if-then-else, and case statements checked and validated the error codes, and the agent process restarts were executed with elevated access via sudo. All the script's activities were logged separately via echo to produce a summary of errors and restarted agents, distributed via email with the mailx tool. Since client transactions were initiated twice daily, I used another tool, crontab, to have my script run on a schedule twice a day, five days a week.
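A minimal sketch of that flow might look something like the following; the log path, log format, error codes, field positions, agent names, restart command, and mail recipient are all hypothetical stand-ins rather than the actual production script:

#!/bin/bash
# Hypothetical monitoring sketch: parse today's log, restart agents for
# known error codes, and mail a summary. Every path, code, and name here
# is invented for illustration.
LOG=/var/log/txserver/$(date +%Y%m%d).log
REPORT=$(mktemp)

tail -n 5000 "$LOG" | grep -E 'ERROR [0-9]+' | cut -d' ' -f3,5 |
while read -r code agent; do
    case "$code" in
        101|102) sudo systemctl restart "$agent" ;;              # known recoverable errors
        205)     sudo systemctl restart "$agent" "$agent-helper" ;;
        *)       echo "Unhandled error $code on $agent" ;;
    esac
    echo "$(date +%T) error $code, agent $agent handled"
done >> "$REPORT"

mailx -s "Transaction server summary" ops@example.com < "$REPORT"

A crontab entry such as 0 8,16 * * 1-5 /usr/local/bin/tx-monitor.sh would then run it at 8:00 and 16:00 on weekdays, matching the twice-daily transaction schedule described above.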

With this script in place, I was finally able to dedicate more of my time to essential daily tasks – friendly chats with coworkers over some good coffee, daily, twice a day – this shall never be automated 🙂 

-Chris Pokorni 

Where did I put that file?

In everyday use, many files are created: documents, spreadsheets, photos, etc. The trick is remembering where I put a given file. My first thought when looking for a file is just to ask "find" to locate it. Something like "find . -type f -iname invoice\*" should return a list of all files whose names begin with "invoice". To expand that a little, try "find . -type f -iname \*invoice\*"; the extra "\*" in front of the name will match any file with "invoice" anywhere in its name, starting from the current directory.

The output from our "find" command might be very large, so we can narrow down the output with the user-friendly "grep" command and a pipe, something like this: find . -type f -iname \*invoice\* | grep 2020. This would cut the list of invoices to those with 2020 in the name.

Once we start using "find" and "grep" we can locate all sorts of things. I take underwater photos and have about 10 years of them, numbering in the hundreds of thousands of pictures living on my storage array. With the aid of a program that peeks into the photo metadata, "exiftool", the photo creation date can be extracted. With the help of these tools, I can locate the photos taken between two dates and create a "dive trip" photo list.
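For example, a single exiftool invocation can pull out everything shot during a given trip. The directory and dates below are made up, and this relies on exiftool's -r, -if and -p options, which recurse into subdirectories, filter on metadata, and format the output:

# Hypothetical example: list every photo created during a two-week trip.
# ~/photos and the dates are placeholders.
exiftool -r -q \
  -if '$CreateDate ge "2019:06:01" and $CreateDate le "2019:06:14 23:59:59"' \
  -p '$Directory/$FileName' ~/photos > dive-trip.txt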

-Lee Elston

Package Managers and Configuration Files

In a normal day, I use several different distributions of Linux; today it's CentOS 8, Fedora 31 and Ubuntu 19.10. The requirement today is to document the instructions to install an application across all of these distributions. Since the software is all open source it should function the same regardless of which distribution is used, but here is the challenge: the distributions are free to locate files on the system anywhere they want. Sometimes a file is in the "/var/lib" subdirectory, or maybe the "/usr/var/lib" directory. If I had a photographic memory I would remember the differences, but there are so many files. My fast and easy solution: ask the package manager (dpkg, rpm) to list where the files for the installed package are. This solves the problem quickly and easily.
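For example, asking each package manager where an installed package keeps its files is a one-liner (nginx here is just a stand-in package name):

dpkg -L nginx    # Debian/Ubuntu: list the files installed by the nginx package
rpm -ql nginx    # Fedora/CentOS/RHEL: the same question, rpm syntax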

-Lee Elston

There is no questioning that calculators changed the way we do math; likewise, cloud technology has changed the way we compute. But underpinning both of these revolutionary developments is the fundamental knowledge needed to optimize their use. Math powers the calculator and Linux powers the cloud.
