Ethereum Mining

Ethereum mining benchmark with 1x Nvidia GTX 1050 TI & Ubuntu 16.04.2

Hashrate: 11MH/s – 12MH/s 

Post update: These hash rates were based on an old version of Claymore (9.2). The latest version of Claymore will yield the highest hash rates possible.

Hello again – I recently benchmarked my GTX 1080, which gave me around 18-21MH/s mining Ethereum (Ethereum mining benchmark with 1x Nvidia GTX 1080 & Ubuntu 16.04.2).

I’m not going to go into the ins and outs of alternatives to the AMD RX 470/570/580 cards; this is just a quick post about what my 1050 Ti does at factory settings.

Gigabyte Nvidia GTX 1050Ti 4GB GDDR5 PCI-E

This particular Gigabyte 1050 Ti only has 1 cooling fan compared to some others I’ve seen. It’s also by far the cheapest variant out there.

I get around 11 to 12MH/s mining Ethereum. I’ve tried a combination of the official ethminer (not Genoil’s) and the Claymore dual miner. Both give similar results. I’m using OpenCL, not CUDA (I can’t get CUDA to work and don’t have the patience to figure out why at the moment).
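
For reference, here’s roughly how the two miners get launched. Treat this as a sketch rather than my exact command lines – the pool URL and wallet below are placeholders, and the Claymore flags are from memory of its readme, so check them against the version you download:

# ethminer with OpenCL, quick built-in benchmark (no pool needed)
ethminer -G -M --opencl-device 0

# ethminer with OpenCL, pointed at a pool
ethminer -G -F <your_pool_url> --opencl-device 0

# Claymore dual miner in Ethereum-only mode
./ethdcrminer64 -epool <your_pool_url> -ewal <your_wallet> -epsw x -mode 1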


Wow, this little thing gets hot (in my subjective experience). I’ve put my hand around my GTX 1080 and felt a ‘bit’ of heat, but when you put your hand around (not on) this little puppy you can really feel it. Apparently, though, this is okay:

noob-miner@dev-pc1:~$ nvidia-smi -q -d temperature

==============NVSMI LOG==============

Timestamp                           : Sat Jun 17 17:13:40 2017
Driver Version                      : 375.66

Attached GPUs                       : 1
GPU 0000:01:00.0
        GPU Current Temp            : 67 C
        GPU Shutdown Temp           : 102 C
        GPU Slowdown Temp           : 99 C
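
If you want to keep an eye on it while the miner is running, nvidia-smi can poll the temperature for you, for example:

# re-run the same query every 5 seconds
watch -n 5 nvidia-smi -q -d temperature

# or let nvidia-smi loop itself, one CSV line per sample
nvidia-smi --query-gpu=temperature.gpu --format=csv -l 5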

Power consumption

Measured @ the plug, power consumption showed 135W. After updating my Nvidia drivers, it settled at around 125W.

That is total system power.

However, if I’m not mining, the power consumption @ the plug is 60W (that PC has an old Athlon X2 CPU and 1x 10K RPM disk drive).

So it’s safe to say the card consumes around 65W when mining – probably less, considering ethminer has spawned a few threads that keep both CPU cores running @ 100%.
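
As a rough back-of-the-envelope calculation (assuming ~12MH/s and that the whole ~65W difference is down to the card):

# MH/s per watt for the card alone
echo "scale=2; 12 / 65" | bc          # prints .18
# kWh per day for the whole rig at the wall
echo "scale=2; 125 * 24 / 1000" | bc  # prints 3.00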

This guy on YouTube gets 14MH/s – admittedly it’s a better 1050 Ti and it’s overclocked.



  1. Hello Guys

    Is it possible to mine Ethereum with a GTX 1050, or is the ‘Ti’ version (GTX 1050 Ti) required?

    Any info about the difference in performance?

    1. Pretty much, but since I was connecting to a pool I used -F, and I specified OpenCL since I couldn’t get CUDA to work for some reason:

      ethminer -G -F <your_pool_url> --opencl-device 0

  2. It’s worth mentioning that using ethminer with a shite CPU won’t cut it. You’ll get “typewriter”-speed screen updates – well, for me anyway; my other rig, which has a Celeron G3900, can’t run ethminer lol. Claymore is a good alternative if you have a crappy CPU like me.

    I’ve read that Pentiums or better don’t suffer from this. I’ll post a video some time this week.

  3. I’m up and running with the latest CUDA drivers on Ubuntu 16.04. I had to back down to the generic kernel, not the LTS one – anything above 4.4 doesn’t build. I, too, am seeing 12MH/s with OpenCL, but I’m also not getting any different speeds with CUDA. Not sure why, honestly. I have the 1050 Ti, which has 4GB of RAM as well.

    1. Hello Matt,

      Hmm, sounds like you’ll probably need to start overclocking if you aren’t already. I had to overclock my 1050 Ti’s to get anything over 12MH/s; 13MH/s is achievable by overclocking the memory alone.

      Also bear in mind the motherboard chipset will have a big impact. E.g. I’ve had an Intel Z270 and a Z87 chipset and now an Intel H110 – and, no surprise, the motherboard with the Intel H110 gives me the least hashing power.
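
For anyone who wants to try the memory overclock mentioned above on Ubuntu, this is a rough sketch of the nvidia-settings route. It assumes the proprietary driver with a running X session; the offset value is just an illustration, not a recommendation, and the performance level index ([3] here) can differ between cards:

# one-off: enable the overclocking controls (Coolbits), then restart X
sudo nvidia-xconfig --cool-bits=28

# bump the memory transfer rate offset for the highest performance level
nvidia-settings -a "[gpu:0]/GPUMemoryTransferRateOffset[3]=600"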


