I am using a Windows desktop and I run Linux as a VirtualBox guest. ESP32 development under Windows is super easy to set up – you only need to install the Arduino IDE. Unfortunately, it really bugged me that compilation is so slow. I’m not experienced enough and test a lot on the real hardware, and slow compilation really slows down the entire development process.
The Arduino IDE v2 has improved and supports parallel compilation within a submodule, but it still runs slower than expected on my computer and is not ideally parallelized. Additionally, I noticed that some libraries are recompiled every time, which is a huge waste of time (and resources) because the libraries don’t change – only my sketch source code does.
I decided to see how compilation performs on Linux. The Arduino project provides a command-line tool, arduino-cli, to manage, compile and upload code. The tool is still in active development but works flawlessly for me.
Here is how I installed it on my Linux desktop:
wget https://downloads.arduino.cc/arduino-cli/arduino-cli_latest_Linux_64bit.tar.gz
tar -xf arduino-cli_latest_Linux_64bit.tar.gz
mv arduino-cli ~/bin/
rm arduino-cli_latest_Linux_64bit.tar.gz
arduino-cli config init
vi /home/famzah/.arduino15/arduino-cli.yaml
# board_manager -> additional_urls -> add "https://raw.githubusercontent.com/espressif/arduino-esp32/gh-pages/package_esp32_index.json"
# library -> enable_unsafe_install -> set to "true" # so that we can install .zip libraries
arduino-cli core update-index
arduino-cli core install esp32:esp32
arduino-cli board listall # you must see lots of "esp32" boards now
Here is how to compile a sketch and upload it to an ESP32 board:
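In its simplest form it boils down to two commands (a minimal sketch – the sketch folder, the serial port and the generic “ESP32 Dev Module” FQBN below are examples; adjust them for your board):
arduino-cli compile --fqbn esp32:esp32:esp32 MySketch
arduino-cli upload -p /dev/ttyUSB0 --fqbn esp32:esp32:esp32 MySketch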
I have created a small script to automate this task which also uses a dynamically created temporary folder, in order to avoid race conditions and artefacts. The sketch folder on my Windows host is shared (read-only) with my Linux guest. I still write the sketch source code on my Windows machine using the Arduino IDE.
The ESP32 board is connected to my Windows host but I need to communicate with it from my Linux guest. This is done easily in VirtualBox using USB filters. A USB filter allows you to do a direct pass-through of the ESP32 serial port. It also works if you plug in the ESP32 board dynamically while the Linux machine is already running.
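If you prefer the command line over the VirtualBox GUI, the same filter can be created with VBoxManage (a sketch – the VM name “linux-guest” and the vendor/product IDs of the CP2102 USB-to-serial chip found on many ESP32 boards are assumptions; look up your actual IDs first):
VBoxManage list usbhost # find the VendorId/ProductId of the board's USB-to-serial chip
VBoxManage usbfilter add 0 --target "linux-guest" --name esp32 --vendorid 10c4 --productid ea60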
Here is a quick recap of my short experience with Oculus Quest:
It’s a wonderful device to use without a computer. The immersion is incredible, the controllers are easy to use, the interface is intuitive, and it’s comfortable to wear even for my 8-year-old daughter, who played Beat Saber a few times.
The resolution is much better than in the previous generation of VR headsets. There is noticeable flickering in bright scenes. The overall brightness of the screen is high and I had to lower it in the settings of the PC games.
The Oculus Link works with my Dell G5 15 (5587) laptop, which has an NVIDIA GeForce GTX 1060 (Max-Q design) video card. This card is listed as not supported but I saw no problems whatsoever. I had to install the latest video drivers and update all the other drivers of my laptop.
For the Oculus Link connection I used a $14 USB-A to USB-C cable, 2.5 m long, manufactured by Vivanco. The cable test in the Oculus app reported a 1.7 Gbps transfer rate via USB 3. The original USB-C to USB-C cable that came with the Oculus Quest resulted in a USB 2 connection with about a 0.3 Gbps transfer rate, and I couldn’t get the Oculus Link to work with it.
The battery life when used with a computer via the Oculus Link is practically unlimited because the device is simultaneously charged while being used. The same applies if you just use the cable to charge the device.
I had no problems playing the Oculus Rift compatible game Dirt Rally, and I had no problems playing Euro Truck Simulator 2 via SteamVR. I’m pleasantly surprised at how well SteamVR and my already installed Steam games integrated with the Oculus Quest via the Oculus Link.
You need a powerful CPU and video card to play games with the highest graphics details and at the highest FPS. With my Dell G5 15 I had to downgrade the graphics to lower settings in order to sustain 36 FPS. If I wanted the full 72 FPS, I had to run the graphics at the lowest detail.
I couldn’t get Firefox to play 3D 360° content via the Oculus Link.
Uploading a 3D movie to Oculus Quest via the USB cable is very fast. The Gallery app lets you easily play the movie.
Playing car racing simulators is absolutely more realistic in VR mode! Very enjoyable! It’s actually a bit too realistic, because I got motion sickness after a dozen seconds. I overcame this problem by shaking my head a bit while driving, like you would on an uneven road. Therefore, I suspect that having a seat motion platform would eliminate my motion sickness entirely.
The field of view (FOV) while driving in a car sim game is sufficient. I guess it’s the same as wearing a helmet. I’m used to driving go-karts with a helmet.
I didn’t test many Oculus VR apps but they seem promising – interactive 3D movies, many games, landmark tours, etc.
My WiFi router wasn’t discovered until I disabled WiFi channels 12 and 13. After that I could pair and set up the Oculus Quest easily using my phone.
My final verdict – the Oculus Quest is an awesome product and you will definitely enjoy VR! Using it without a PC in standalone mode is easy and there are enough games and apps to enjoy. Using it with a PC via the Oculus Link requires a powerful PC to play at the highest graphics details, but it works flawlessly at lower graphics details, too.
In the end, I returned the Oculus Quest headset for the following reasons:
I bought it mainly to play games on my laptop, but the laptop is not powerful enough to support the highest graphics details in VR like it does on a monitor. If I’m going to buy a gaming rig, then maybe I’ll opt for an Oculus Rift S (or similar), because it was designed to be connected to a PC, while the Oculus Link could introduce problems (compatibility, video compression artefacts, etc.).
I was hoping for a slightly better resolution. It’s not that the resolution is noticeably bad now – compared to the first generation of VR headsets it’s much better. But I think I’ll just wait for another generation of VR hardware. Or maybe I’ll try the Pimax 8K VR headset.
The noticeable flickering in bright scenes and the slight motion sickness in games could be caused by the relatively low refresh rate of 72 Hz.
I replaced my old laptop with the incredible Dell G5 5587 because I wanted a faster system. This new gaming laptop doesn’t disappoint and is very fast, even suitable for modern games. But… the moment you unplug the power adapter, the system becomes very slow. Sometimes the CPUs won’t go over 800 MHz; sometimes they will for a while, but the overall experience is very sluggish.
What did I try to fix this on Windows 10? I changed the power plan to “High performance”. I customized the power plan and configured maximum performance for the Intel graphics card and the Processor power management. I updated the BIOS. I disabled Battery Boost™ in the NVIDIA GeForce Experience control panel. I even spoke with Dell technical support. Nothing helped!
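For the record, switching the power plan can also be done from an elevated command prompt (scheme_min is the built-in alias of the “High performance” plan):
powercfg /list # show the available power plans
powercfg /setactive scheme_min # activate “High performance”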
UPDATE: I uninstalled “Intel Dynamic Platform and Thermal Framework” and this seems to have finally fixed it! I’ve been running the laptop on battery a few times now and its performance is finally acceptable.
Battery life is the same after this change. It takes 2:15 h to discharge the battery down to 10% with office work (browser, mail client, Excel), with the screen brightness at 40%.
Here is my success story on how to compile and flash your own firmware for NodeMCU running on the ESP32 microcontroller. Although there is official documentation on how to achieve this, there were some minor issues that I stumbled across.
Using the Docker image is the recommended and easiest way to proceed. I didn’t have a Docker server, so I started a cheap VPS from DigitalOcean. They have a one-click installer for Docker. Once the VPS machine is running, execute the following, which covers the step “Clone the NodeMCU firmware repository” from the docs:
### log in via SSH to the Docker host machine
cd /opt
git clone --recurse-submodules https://github.com/nodemcu/nodemcu-firmware.git
cd nodemcu-firmware
git checkout dev-esp32
git submodule update --recursive
# https://github.com/marcelstoer/docker-nodemcu-build/issues/58
git submodule update --init
Then follow the instructions for the step “Build for ESP32” from the docs:
docker run --rm -ti -v `pwd`:/opt/nodemcu-firmware marcelstoer/nodemcu-build configure-esp32
docker run --rm -ti -v `pwd`:/opt/nodemcu-firmware marcelstoer/nodemcu-build build
Note: If you use the Heltec ESP32 WiFi Kit 32 with OLED, when configuring the firmware options you have to change the main XTAL frequency from the default 40 MHz to either “Autodetect” or 26 MHz, as explained here. I selected “Autodetect” (26 MHz) under “Component config” -> “ESP32-Specific” -> “Main XTAL frequency”.
For the impatient — most probably the only configuration you need to do is in the section “Component config” -> “NodeMCU modules”.
I encountered a compilation error which was caused by a defined-but-not-used function:
make[2]: Entering directory '/opt/nodemcu-firmware/build/modules'
CC build/modules/mqtt.o
CC build/modules/ucg.o
/opt/nodemcu-firmware/components/modules/ucg.c:598:12: error: 'ldisplay_hw_spi' defined but not used [-Werror=unused-function]
static int ldisplay_hw_spi( lua_State *L, ucg_dev_fnptr device, ucg_dev_fnptr extension )
^
cc1: some warnings being treated as errors
/opt/nodemcu-firmware/sdk/esp32-esp-idf/make/component_wrapper.mk:285: recipe for target 'ucg.o' failed
make[2]: *** [ucg.o] Error 1
make[2]: Leaving directory '/opt/nodemcu-firmware/build/modules'
/opt/nodemcu-firmware/sdk/esp32-esp-idf/make/project.mk:505: recipe for target 'component-modules-build' failed
make[1]: *** [component-modules-build] Error 2
You can simply comment out the function in the firmware sources, or configure the build to ignore the warning. Start a shell in the Docker image and edit the sources:
docker run --rm -ti -v `pwd`:/opt/nodemcu-firmware marcelstoer/nodemcu-build bash
# you are in the Docker image now; fix the source code
vi /opt/nodemcu-firmware/components/modules/ucg.c # add the following line (including the leading "#" sign):
#pragma GCC diagnostic ignored "-Wunused-function"
exit
# you are in the Docker host; redo the build as usual
docker run --rm -ti -v `pwd`:/opt/nodemcu-firmware marcelstoer/nodemcu-build build
Because it took me a few attempts to configure everything properly and finally build the image, I used the following alternative commands to make the process easier:
docker run --rm -ti -v `pwd`:/opt/nodemcu-firmware marcelstoer/nodemcu-build bash
# you are in the Docker image now
cd /opt/nodemcu-firmware
make menuconfig
make
Once the build completes it produces three files on the Docker host:
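These are the bootloader image, the partition table, and the NodeMCU application image. With an ESP-IDF make build they typically land under the build/ directory (e.g. build/bootloader/bootloader.bin, build/partitions_singleapp.bin and build/NodeMCU.bin); the exact file names may differ between firmware versions, so check the build output.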
Copy those three files to your local development machine where you are about to flash the NodeMCU controller. My first attempt was rather naive and the chip wouldn’t boot. It issued the following error on the USB console:
Early flash read error: "flash read err, 1000"
It turned out that I had accidentally overwritten the bootloader and the partition table. Fortunately, the official ESP32 docs explain this in detail here and here.
Here is what worked for me to flash my custom NodeMCU firmware into ESP32 from my Windows workstation:
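A sketch of the flashing command, using the esptool utility from the ESP-IDF (the COM port and file names below are assumptions — substitute your own serial port and the three files from your build). The offsets are the standard ESP32 layout described in the ESP-IDF docs: bootloader at 0x1000, partition table at 0x8000, application at 0x10000:
python esptool.py --chip esp32 --port COM3 --baud 115200 write_flash 0x1000 bootloader.bin 0x8000 partitions.bin 0x10000 NodeMCU.bin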
I recently changed my old laptop for a new HP ProBook 450 G2. The driver installation on my Windows 7 Home Premium (64-bit) was pretty straightforward, except for the following two devices, which were shown as not supported in the Device Manager:
The first one turned out to be the driver for “HP 3D DriveGuard 6” which I didn’t install because my SSD drive won’t benefit from such a feature.
The second one was a more interesting case. It turns out that not all HP drivers were listed when I selected my operating system “Microsoft Windows 7 Home Premium (64-bit)” on the HP drivers website. I had to change the OS to “Microsoft Windows 7 Professional (64-bit)” in order to get the option to download the “Intel Chipset Installation Utility”, which resolved the missing drivers.
Here are some stats for one of the SSDs which we use at work. The most valuable S.M.A.R.T. attribute we focus on here is “Host_Writes_32MiB”, which shows how much data the SSD has written in its lifetime. This is the most important indicator because SSD drives can sustain only a limited number of Program/Erase (P/E) cycles over the life of the flash memory before they start to fail. According to Wikipedia, the number of those cycles is as follows:
Single-level cell (SLC) flash, designed for higher performance and longer endurance, can typically operate between 50,000 and 100,000 cycles.
As of 2011, multi-level cell (MLC) flash is designed for lower cost applications and has a greatly reduced cycle count of typically between 3,000 and 5,000.
Those numbers refer to the internal rewrites of the flash cells at the hardware level. Because of write amplification (the complicated internal storage processes inside the SSD), a 3200 MiB write transfer from the computer, accounted as 100 units of 32 MiB, may actually cause more than 100 internal write cycles on the flash cells. You can review the “Logged S.M.A.R.T. Data” chart in this example.
Another thing to mention before we get to the actual statistics: the “Power_On_Hours” S.M.A.R.T. attribute is correctly accounted for and 100% matches the actual time this SSD drive has been in service.
This sample SSD has the following S.M.A.R.T. attributes:
Host_Writes_32MiB: 3681999
Power_On_Hours: 41204
We used 40680 power-on-hours in our calculations, because the SSD was “on” a few days before we started using it in production.
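From those two numbers the average write rate is easy to derive (the counter unit is 32 MiB). A quick sanity check with bc:
echo '3681999 * 32 / 1024 / 1024' | bc # ≈ 112 TiB written in total
echo '3681999 * 32 / 40680' | bc # ≈ 2896 MiB written per power-on hour, i.e. ~2.8 GiB/hour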
I got my hands on the following server for a day, so I decided to measure its power consumption, because the new Intel Xeon E3 processor series looks very promising. These CPUs support ECC memory and at the same time have “Intelligent, Adaptive Performance”, which in plain words means that they can power themselves down and thus save energy. Furthermore, their price and the price of the motherboards are fair, as these CPUs seem to be meant mainly for desktop workstations. Having ECC support lets us use them in servers too. The only caveat is that these Sandy Bridge based Xeon CPUs support only a single-CPU configuration — so don’t try to find a dual-CPU motherboard.
BIOS settings are set up for optimal power savings without compromising performance. FAN control is enabled too. Room temperature is 21 degrees Celsius.
Power usage with different server utilization scenarios follows:
7W — power off; idle consumption, the IPMI is alive
39W — power on; Linux OS is idle
IPMI sensor readings: cooling FAN works with 1755 RPM — relatively quiet; CPU temperature is Low
45W to 60W — power on; moderate Linux OS usage
load average: 1.53; installing 200 new packets via “apt-get”
IPMI sensor readings: cooling FAN works with 1755 RPM — relatively quiet; CPU temperature is Low
130W — power on; full stress by “stress --cpu 16 --io 8 --vm 8 --vm-bytes 1780M --hdd 4”
load average: 36.00; I/O load: 100%, mostly write; CPUs busy @ 100%, 70% user, 30% system, all CPU cores are utilized
RAM: about 95% used, 30% cached; network load: 22 Mbit/s constant SSH transfer
IPMI sensor readings: cooling FAN works with 3100 RPM — much noisier; CPU temperature is Medium
Recently I needed to expand my wireless network range. The spot where I needed wireless and wired network coverage was too far away from my main wireless AP, so I also needed a gain antenna. It turned out that most wireless routers cannot use an external antenna, because their original one cannot be dismounted. That is how I ended up with the TL-WR741ND wireless router, which can be used with an external antenna and is also very cheap. My local PC store had a 7 dB omni-directional antenna by Intellinet, so I got one of those too.
Design and hardware purchase were the easy part. The TL-WR741ND supports wireless bridge mode (WDS), but unfortunately it did NOT work out of the box for me. The router joined the wireless network of my main Wi-Fi router, and I could see it there as “associated authorized”. However, the system log of the TL-WR741ND device was showing some DHCPC (probably “DHCP client”) errors and nothing worked as expected. I tried to join the TL-WR741ND to both my ASUS routers (WL-520gC and RT-N10) but with no luck. I also tried to help the TP-LINK router along by doing some setup as advised in the ASUS Wireless Router WDS Configuration Guide and in the “How to Setup WDS with Asus RT-N16 and Linksys WRT54G” article. This did not help, and I reverted the changes on my ASUS routers in the end.
After wasting two hours, I found a forum thread where a guy had a similar issue and finally found a solution:
after 4 days unsuccessful testing client bridge (i need repeater bridge but not possible on my device…with ddwrt) on wr741nd(v2.4)/ddwrt, i found solution: install Gargoyle firmware v1.13.10, very intuitive and easy configuration (as repeater bridge), it works perfectly! Total time spent: 5 min.!
I can confirm his solution — installing and setting up the stable Gargoyle free router firmware solved my problem in a snap. Tested with a version 2.4 TL-WR741ND device, using Gargoyle version 1.4.5 for TL-WR741ND hardware version 1.x (the firmware is compatible with version 2.x devices).
Monitoring and controlling relative humidity is important for human health. Too low or too high humidity feels uncomfortable, but most importantly, high moisture is a factor for growing mold in your home, which could be health threatening (according to the EPA and CDC). I will not go into details on how to control humidity. Instead, I’ll describe what motivated me to design and create my own temperature and humidity sensor which reports its readings every minute to a central Linux server.
The main requirements for my design were the following:
Affordable price, as I wanted to install four sensors.
Great accuracy both for temperature and humidity readings.
Over-the-air communication, as I wanted to be able to install a sensor even in my bathroom, where I can’t run data or power wires. Support for wired communication too, so that we can reduce the overall price by not installing the wireless module.
Data logging to a computer, because both temperature and humidity change with time, for example when you sleep in the room, and you can’t look at a mechanical temperature or humidity meter every minute, in order to write down the results.
Battery operated, in order to avoid any wiring.
Open-source hardware and software toolchain, so I chose Atmel AVR microcontrollers. I got sick of Microchip and their commercial C compilers.
To have fun with electronics but at the same time create the device as fast as possible, as free time turned out to be a pretty limited resource recently.
I managed to accomplish most of the requirements I set, with two exceptions: the device operates for only a month on batteries, and cumulatively I spent almost a week designing, soldering, developing the firmware, and testing the device. Now all the sensors operate from a wall-plug power adapter, and my hunger for environmental control in my house is satisfied.
I’ll now try to describe the whole process and the reasons behind my engineering decisions. Note that I’m an amateur hobbyist.
Idea and requirements
I wrote down all my thoughts in a text editor, then reworked those notes into requirements, and did so a few more times, in order to finally decide what I wanted to build and not get distracted by new random ideas in my head.
Power supply
I wanted the device to be able to operate both via USB, and thus be powered by 5V, as well as to be powered by an accumulator or a battery with an input voltage up to 12V, so that it could be used in a car too. I put a polarity protection diode D1 in series with the power line, so that an accidental polarity mismatch doesn’t burn out the power regulator. Such a protection diode must have very low voltage drop and thus low power loss, and the Schottky diode 1N5819 seemed like a good match.
Operating from a battery also means that the voltage regulator must be extremely efficient and have a low quiescent current, i.e. it should draw almost nothing while there is no load connected at its output. Most battery-operated devices “sleep” during most of their life cycle, so their consumption is close to zero. I used the ultra-low-dropout fixed voltage regulator LP2986-33, marked as U1 in the schematics. The whole circuit operates at 3.3V because of the XBee wireless modules, and also because operating at a lower voltage usually gives lower power consumption.
Since we can have two different power sources, there must be a way to choose which one is active. You can switch between the power sources using the PWR_SELECT jumpers.
Wired communication via USB
I wanted to have the option to use the sensors by directly connecting them to a computer. This way we could save the money for an XBee wireless module. I used the classical USB-to-Serial solution FT232R, which is also quite inexpensive and requires almost no external components. You can see it in the schematics as U2. Note that the I/O lines of FT232R must be configured to operate at 3.3V too. This is done by connecting pin 17, which is the internal 3.3V regulator of FT232R, to pin 4. The internal 3.3V regulator is not used for anything else, and in theory I could have powered the I/O lines, pin 4, directly from the main voltage regulator U1.
Wireless communication
The XBee modules are something I had wanted to play with for a long time. They seem very easy to work with and are packed with all kinds of features, though in my case I’m using almost none of them, not even the AES encryption which could secure the data channel. I’m using the Series 1 XBee low-power embedded RF modules (XB24), which have a transmit power of 1 mW and a 30 m indoor range. There are many comments on the Internet that the indoor range of the XBee modules is poor, and I can confirm that. The range really depends on what the signal must travel through. Sometimes you lose the link even through one wall; sometimes it can go through a few walls. The XBee & XBee-PRO OEM RF Module Antenna Considerations article by the XBee manufacturer is a great read. After all, with such a low-power module we probably shouldn’t expect great results. It works well in my apartment though — all rooms report to the central XBee module successfully. On the server side (the receiver) I first had an XBee with a chip antenna, which I replaced with an XBee-PRO with a whip antenna. This made no difference.
Wiring the XBee module is very easy. It requires no external components. If you read the PDF datasheet, you’ll see how many great features an XBee has. I’m using only three of them:
Sleep mode — the microcontroller puts the XBee to sleep by controlling the SLEEP_RQ pin 9.
Networking addressing — each XBee is configured with a unique address, so that the receiver on the server side knows which reading belongs to which sensor probe.
API operation — the receiver XBee module operates in an API mode, which is a frame-based protocol that provides greater flexibility and more control. For example, besides the received data payload, an API frame gives information about the sender’s address and the signal quality.
Temperature and humidity sensor
I wanted to interface the sensor directly using a digital protocol, in order to minimize the ADC-related complexity and errors. The SHT11 turned out to be the sensor I was looking for:
Relative humidity accuracy: +/- 3% in the range 20% to 80% RH. The sensor comes fully calibrated.
Temperature accuracy: +/- 1.5 degrees Celsius in the range -15 to +65 degrees Celsius.
Digital two-wire interface.
Very low energy consumption: 80uW (at 12bit, 3V, 1 measurement/s).
The SHT11 is a bit pricey but works easily and accurately out of the box, so I decided to go with it. There is a very good alternative at SparkFun — the RHT03 humidity and temperature sensor (also known as the DHT22). There were some contradictory comments by SparkFun users — some say it works very well, some doubt its accuracy. I haven’t tried it, but I have left space JP7 on the current board, so that at some later time I could solder on an RHT03 and use it with the existing schematics.
One note about the SHT11 two-wire interface: definitely use a pull-up resistor on the DATA wire, as advised in the PDF! I tried to do some magic with the microcontroller instead and failed. With the exception of the pull-up resistor, everything else worked with no problems with the SHT11 sensor. The manufacturer Sensirion provides sample code for the SHTxx sensors, which turned out to be very useful. I was able to re-code it for the AVR GNU C compiler in a couple of minutes.
The CRC calculation got me a bit confused. There are multiple different ways to calculate a CRC checksum, and they all provide different results. Each CRC calculation depends on the selected CRC polynomial, which is something like a bit mask that defines the algorithm for the CRC calculation. After lots of struggle, I finally found an excellent Online CRC Calculation web wizard, which also includes a hardware implementation example and sample C and Verilog implementations which you can copy-paste into your program. Thank you, Kay Gorontzi!
Microcontroller
Initially I worked with the ATmega8, then I switched to the ATmega168 because of its much lower power consumption. I could have used any other Atmel AVR microcontroller which has a USART, an internal oscillator, and a sleep mode, but the ATmega8 and ATmega168 are always available in my local electronics shop, so I chose one of them. Besides the lower power consumption, the ATmega168 has one other major advantage for my application — the watchdog timer can wake the chip from sleep mode and directly execute an interrupt, thus not re-starting the program from the very beginning.
Firmware
I’m working on Windows 7 64-bit and used a USBasp programmer to download the code into the microcontroller. The whole development toolchain is packaged into the WinAVR suite. It includes the AVR GCC compiler and the avrdude programmer. I also downloaded a sample Makefile which makes compilation and firmware download easy.
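For reference, a typical avrdude invocation for this setup looks like this (a sketch — main.hex is a placeholder for the file produced by the Makefile):
avrdude -c usbasp -p m168 -U flash:w:main.hex # program the ATmega168 flash via the USBasp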
The main loop of the program does two tasks — it measures and transmits the readings over the serial port (which goes to USB, or over the air via the XBee), and then sleeps for about 60 seconds. As already mentioned, I use the new ATmega168 feature which allows the watchdog timer to generate an interrupt that wakes the chip from sleep mode. This is very handy, as it allows you to continue the program at the point where you put it to sleep. The sleep mode was something new for me; there are some URLs in the source code pointing to the online articles that helped me master it. Note that the XBee RF transmitter is also put into sleep mode, in order to save battery.
Data collector
All the sensor readings are collected on a Linux server over the air. I use an XBee Explorer USB by SparkFun to connect the XBee receiver to the Linux server. The XBee is seen as a serial device on the Linux box. The frame protocol of the XBee API is easy to understand, and I implemented a Perl script to parse it. Here is a sample reading received from one of my wireless sensors (0x0001 is the address of the probe standing outside my apartment):
[Mon Dec 26 17:32:11 2011] RX_packet: source=0x0001, rssi=-55dBm (opt=0x00): 4.24;69.04
As you can see, it’s winter here now — 4.24 degrees Celsius temperature; 69.04% relative humidity (RH).
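For illustration, here is my by-hand reconstruction of how that very reading looks as a raw Series 1 API frame on the serial port (based on the frame format from the XBee datasheet — treat it as a sketch):
7E 00 0F # start delimiter 0x7E; frame data length = 15 bytes
81 # API identifier 0x81: RX packet with a 16-bit source address
00 01 # source address 0x0001
37 00 # RSSI 0x37 = -55 dBm; options 0x00
34 2E 32 34 3B 36 39 2E 30 34 # payload: the ASCII string "4.24;69.04"
42 # checksum: 0xFF minus the low byte of the sum of the frame data bytes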
Board design
Both the schematics and PCB board were designed using Eagle PCB by CadSoft. This is a great piece of software. Most PCB factories accept Eagle board files directly. You’ll find my Eagle files in the Resources section at the end of this article.
Lessons learned
There are a few things which I discovered only once I had already built and tested the schematics:
Battery-operated devices are hard to design — in theory my sensors were supposed to last for about 3 months on a 9V battery. In practice only one of them lasted a month; the others, a week.
Electronic components, boards and/or assembly can differ a lot — see above. Also, one of the SHT11 sensors sometimes gives CRC errors.
XBee indoor range is not excellent.
Research and development takes a lot of time, usually 2x or 3x the time you planned. Furthermore, building something with love takes even more time, but in the end it pays off with great results and satisfaction.
You can create an electronic device at a lower price than what is currently offered on the market. But this has its price too — your time, and you get no guarantee whatsoever.
I had different plans for this blog article, but it got so lengthy that I wrote it over four different days (and it’s Christmas now). The main idea was to sketch the device and all its components, and to show that they can work together as a finished product. If there is any interest from other people, I’m happy to answer any questions.
I want to share my outstanding experience with Intel when I had to return my failed Intel X25-M SSD drive. The disk was purchased in Mar/2009 from Amazon, and Intel claims a 3-year limited warranty for this model, so I was almost at the end of the warranty. Here is a timeline of the actions:
Mon, 24/Oct/2011: First I contacted Amazon, because this is what Intel’s policy states — contact the OEM provider first. Amazon directed me to contact Intel directly, so I sent them an e-mail.
Tue, 25/Oct/2011: 24 hours passed and I got no reply, so I decided to try Intel’s chat support. I had to wait for North America working hours; before that, chat support was unavailable. My chat with the US support representative was quite unhelpful — they stated that they had no idea which support center was handling my request, because I am from Europe and they are the North America support center. So I had to wait for an e-mail reply, or try to call the EMEA support centers by phone. It was late at night here in Europe, so I decided to wait.
Wed, 26/Oct/2011: Got a reply by e-mail from the EMEA support center. We exchanged some e-mails.
Thu, 27/Oct/2011: Some more tips and communication with the EMEA support member. They finally decided that I wouldn’t be able to fix the SSD drive myself, and that it should be replaced. I received a Standard Warranty Replacement (SWR) order number along with instructions on how to send the defective SSD disk to Intel.
Fri, 28/Oct/2011: I shipped the SSD drive via DHL, as instructed. The DHL delivery was paid for by Intel!
Mon, 31/Oct/2011: DHL delivered the package to Intel’s RMA center. It took 3 days.
Tue, 1/Nov/2011: Intel shipped back a package via DHL.
Wed, 2/Nov/2011: DHL delivered the package to me. It took 1 day — it seems that Intel used an express DHL delivery option!
I received a brand new SSD drive of the same model and size, and Intel kept me as a 100% satisfied customer! 🙂