
Evan Lin

The Problem:

  • The transportation industry accounts for 29% of CO2 emissions.

What do these two industries have in common? They both use steel, a material that contributes two tonnes of carbon emissions for every one tonne produced. So is there an alternative?

Turns out there is. Carbon fibre is a material that is up to ten times stronger than steel and five times lighter, and it not only contributes less CO2 but is also carbon negative.


Learning curves galore, and I’m hungry to learn more!

Learning backend development for the first time was a b*tch. I thought I only needed a couple of frameworks and libraries to get the job done in a week, but boy was I wrong. Allow me to explain.

Motivation 💡

Before I start, I’d like to give a brief introduction. Hi! My name is Evan and I’m a 16-year-old innovator at The Knowledge Society. A couple of months ago, I began my journey in AI by building machine learning models with simple regressors and then calling it a day.

That sh*t was boring. These models didn’t have much interactivity to them. Nobody…


Taking a twist on the classic good ol’ MNIST Digit project

Watch the first 1 minute and 20 seconds of the video before reading further into this article.

In it, I explain my motivation for building this project and how I approached learning the prerequisites.

I’m assuming that since you’ve reached this point, you’ve finished watching up to the timestamp above. If you watched beyond the 1 minute and 20 seconds, you may or may not have heard me say that the video is solely meant to showcase the demo of the code. So let me cut to the chase. …


“Am I on the right path? Am I really learning?”

Things might only start making sense later.

“Am I on the right path? Am I really learning?” This is a question I would typically ask myself in the middle of researching a topic or in the middle of building a project.

Remember this word. Middle.

Two months ago, when I started learning machine learning, I had absolutely no clue what I was doing. I was copying and pasting code and watching playlist after playlist of machine learning tutorials, but I never completed a single one. At most, I made it to the middle of one.

This is where…


The second part of the quick and fluff-free introduction to neural networks in machine learning, and how to build your first neural network model! 😄

Alright, so let’s review these key terms from the last article (a short code sketch tying them together follows the list):

Feature: The inputs to our machine

Target data: The actual answer

Labels: The output predicted by the machine — the machine’s guess for the actual answer

Epoch: A full iteration of learning

Stochastic: Random
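
To make these terms concrete, here’s a minimal sketch (not code from either article) that assumes TensorFlow/Keras is installed and uses made-up toy data; the variable names features, targets, and labels are only there to illustrate the terms above.

```python
import numpy as np
import tensorflow as tf

# Features: the inputs to our machine (100 made-up samples, 4 numbers each)
features = np.random.rand(100, 4)

# Target data: the actual answers we want the machine to learn
targets = np.random.rand(100, 1)

# A tiny neural network
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(1),
])

# "sgd" is stochastic gradient descent; "stochastic" is the random part
model.compile(optimizer="sgd", loss="mse")

# Epoch: one full iteration of learning over all of the features
model.fit(features, targets, epochs=5, verbose=0)

# Labels (in the sense above): the machine's guesses for the targets
labels = model.predict(features)
```

Nothing meaningful is learned from random numbers here; the point is only to show where each term shows up in a real training loop.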

Addressing concerns:

“But wait, will I need any prior programming knowledge?”

Not at all. It certainly helps to have a programming background to follow the code conventions, but the intuition I’ll try to explain will be the same.

“What if my computer isn’t powerful enough?”


A quick and fluff-free introduction to neural networks in machine learning, and how to build your first neural network model! 😄

Caveat:

Note: If you’re new to deep learning and this is your first article, or you’re expecting to really understand this article, expect to spend 3-4x more reading time than what Medium estimates. I’d also recommend grabbing a pencil and paper to follow along. If not, then maybe read it out loud.

My Intention:

I want this article to help build your intuition for how and why neural networks work. My goal is to fill in any nooks and crannies of uncertainty that may arise in your introduction to deep learning.

Let’s get started!

Alright, let's make sure we understand these three things first:

  1. MINDSET to learn!


A brief summary of what to expect in the near future for Quantum Computing…

If you are completely new to quantum computing, I’d recommend reading this article first before continuing: https://medium.com/@evxxan/the-art-of-replicating-nature-itself-5df6cfb8e541

If I came out of a time machine and told you that we are reliving the computational era of the 1950s, wouldn’t you think I was crazy? Whether you said yes or no, you’d be right AND wrong at the same time. (Superposition pun intended.)

Alright, let me clear up this confusion.

In the 1950s, the birth of the first classical computer captured the attention of many…


Taking Quantum Computing with a grain of salt… or better yet, a bean of coffee…

Figure 1: A pile of coffee beans

Ah yes, the classic Monday morning, where we all perform our weekly ritual of groggily sulking into the kitchen to make a cup of coffee. If you’re as familiar with this little event as I am, you then likely commute to your 9–5 or school and wait for that 250 mL of coffee to “kick in” before you start typing away on a computer. In my case, this cup of coffee is helping me with writer’s block.

We’re not all that different from the conventional computer, in fact…


Face recognition, smoke detectors, automatic doors… these are all examples of sensors that we interact with in our day-to-day lives. We know they exist because we can see them built into gadgets, or when we look up at an overhead device scanning for our presence. Let’s say that one day, all these familiar devices were shrunk down to an unimaginable form that cannot be seen with the naked eye. Let’s also say that these little devices became the conventional way to monitor your health, taking measurements by the second. …

Evan Lin

Innovator at The Knowledge Society (TKS). Interested in Machine Learning and Quantum Computing.
