Hardware exploration on Google Colab and Kaggle platforms

Privalov Vladimir
Jul 19, 2019 · 3 min read

For experimenting with Machine Learning we need appropriate computation capabilities. It is often hard to afford a desktop PC or laptop with proper hardware because of the price. A good alternative is to use free public cloud platforms like Google Colab or Kaggle.

I have been using these platforms for some time without thinking about their internals. A while ago I decided to explore what computation power resides under the hood of these platforms to get a better picture of them. Here are my findings.

CPU

You can get information about the CPUs with the command:

!cat /proc/cpuinfo

Here is the output for Google Colab:

processor : 0
vendor_id : GenuineIntel
cpu family : 6
model : 63
model name : Intel(R) Xeon(R) CPU @ 2.30GHz
stepping : 0
microcode : 0x1
cpu MHz : 2300.000
cache size : 46080 KB
physical id : 0
siblings : 2
core id : 0
cpu cores : 1
apicid : 0
initial apicid : 0
fpu : yes
fpu_exception : yes
cpuid level : 13
wp : yes
flags : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ss ht syscall nx pdpe1gb rdtscp lm constant_tsc rep_good nopl xtopology nonstop_tsc cpuid tsc_known_freq pni pclmulqdq ssse3 fma cx16 pcid sse4_1 sse4_2 x2apic movbe popcnt aes xsave avx f16c rdrand hypervisor lahf_lm abm invpcid_single pti ssbd ibrs ibpb stibp fsgsbase tsc_adjust bmi1 avx2 smep bmi2 erms invpcid xsaveopt arat arch_capabilities
bugs : cpu_meltdown spectre_v1 spectre_v2 spec_store_bypass l1tf
bogomips : 4600.00
clflush size : 64
cache_alignment : 64
address sizes : 46 bits physical, 48 bits virtual
power management:

To save space I have shown the specification for only one CPU. There are two logical CPUs @ 2.3 GHz available.
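
If you prefer to check this from Python, here is a minimal sketch using only the standard library:

# Count the logical CPUs visible to the notebook; it should report 2 here,
# matching the "siblings" field in the output above.
import os

print("Logical CPUs:", os.cpu_count())

# The model name can also be pulled straight out of /proc/cpuinfo.
with open("/proc/cpuinfo") as f:
    models = {line.split(":", 1)[1].strip() for line in f if line.startswith("model name")}
print("CPU model:", ", ".join(models))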

The output for Kaggle is similar; it also reports Intel Xeon CPUs.

How much disk space is available? Check it with:

!df -h

On Google Colab about 49 GB is available in total; on Kaggle about 220 GB.
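
You can get the same numbers from Python with the standard library; a small sketch (the exact figures depend on which filesystem you inspect):

# Report the size of the filesystem that the notebook's root directory lives on, in GB.
import shutil

total, used, free = shutil.disk_usage("/")
gb = 1024 ** 3
print(f"Total: {total / gb:.1f} GB, used: {used / gb:.1f} GB, free: {free / gb:.1f} GB")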

Check the amount of RAM:

!cat /proc/meminfo | egrep "^MemTotal"

On Google Colab you have about 13 GB of RAM, on Kaggle about 26 GB.
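
If you want the figure in a more convenient unit, here is a small sketch that parses /proc/meminfo directly:

# Read MemTotal (reported in kB) and convert it to GB.
with open("/proc/meminfo") as f:
    meminfo = dict(line.split(":", 1) for line in f)
mem_total_kb = int(meminfo["MemTotal"].split()[0])
print(f"MemTotal: {mem_total_kb / 1024 ** 2:.1f} GB")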

Networking details

There is one more difference between Google Colab and Kaggle. In a Kaggle kernel no internet connection is available by default, so code that tries to download data, like the following, will fail:

from keras.datasets import mnist
(x_train, y_train), (x_test, y_test) = mnist.load_data()  # tries to download MNIST, so it fails without internet
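
One way around this, sketched below, is to attach a dataset containing the MNIST archive to your kernel and load it from disk; the path ../input/mnist-numpy/mnist.npz is hypothetical and depends on the dataset you actually attach.

# Fall back to a locally attached copy of MNIST when there is no internet access.
import numpy as np
from keras.datasets import mnist

try:
    (x_train, y_train), (x_test, y_test) = mnist.load_data()  # needs internet access
except Exception:
    # "../input/mnist-numpy/mnist.npz" is a hypothetical path; point it at your attached dataset.
    with np.load("../input/mnist-numpy/mnist.npz") as data:
        x_train, y_train = data["x_train"], data["y_train"]
        x_test, y_test = data["x_test"], data["y_test"]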

Another difference concerns how notebooks are created: on Kaggle you create a kernel tied to a specific competition or dataset, while on Google Colab you are not limited in this sense.
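
Whatever you attach to a Kaggle kernel shows up as read-only files. A quick way to see what is available (../input is the conventional location at the time of writing):

# List the datasets attached to the current Kaggle kernel.
import os

print(os.listdir("../input"))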

GPU

Let’s now check GPU information.

On Google Colab:

from tensorflow.python.client import device_lib
device_lib.list_local_devices()

The relevant line of the output:

physical_device_desc: "device: 0, name: Tesla K80, pci bus id: 0000:00:04.0, compute capability: 3.7"]

Google Colab uses an Nvidia Tesla K80 GPU. Kaggle also provides a Tesla K80.
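
Two more quick checks, sketched below, assuming the GPU runtime is enabled in Colab: ask TensorFlow which GPU device it sees and print the driver-level view with nvidia-smi.

# Confirm the GPU from TensorFlow and from the NVIDIA driver.
import subprocess
import tensorflow as tf

print(tf.test.gpu_device_name())                         # e.g. /device:GPU:0
print(subprocess.check_output(["nvidia-smi"]).decode())  # lists the Tesla K80 and its memory usage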

Now you know more about what the Kaggle and Google Colab platforms offer. Enjoy experimenting with Machine Learning on them.
