Kicking off a busy day for the Internet of Things market, Intel this morning is announcing the Atom E3900 series. Based upon the company’s latest generation Goldmont Atom CPU core, the E3900 series will be Intel’s most serious and dedicated project yet for the IoT market.

We’ve talked about Intel’s IoT efforts off and on over the past couple of years. Having largely missed the boat on mobile, the company decided it wouldn’t miss IoT as well, and as a result they’ve been making significant investments into the IoT market, treating it as a fourth pillar of their business. Their efforts have not gone unrewarded, as IoT revenue has continued to quickly grow over the years, and as of Q3’16, IoT is now Intel’s third largest business by revenue, ahead of non-volatile memory and behind only client and data center revenue. Furthermore with 19% year-over-year growth, IoT still looks like a growing market for the company.

Overall the company offers a number of IoT products, ranging from the tiny Curie to the relatively powerful Atom E-series. However to date, the company has never built a dedicated, high-end IoT die. The Atom E3800 series was Bay Trail, and the recently announced Joule devices use Broxton, Intel’s canceled-for-mobile Atom SoC. Today that changes, as the launch of the Atom E3900 series brings with it Intel’s first custom silicon targeting the roughly 6W to 12W market of more powerful IoT devices.

So what does an IoT-centric Atom look like? By and large, it’s Broxton and more. At its core we’re looking at 2 or 4 Goldmont CPU cores, paired with 12 or 18 EU configurations of Intel’s Gen9 iGPU. However this is where the similarities stop. Once we get past the CPU and GPU, Intel has added new features specifically for IoT in some areas, and in other areas they’ve gone and reworked the design entirely to meet specific physical and technical needs of the IoT market.

The big changes here are focused on security, determinism, and networking. Security is self-evident: Intel’s customers need to be able to build devices that will go out into the field and be hardened against attackers. Bits and pieces of this are inherited from Intel’s existing Trusted Execution Technology, while other pieces, such as boot time measuring, are new. The latter is particularly interesting, as Intel is measuring the boot time of a system as a canary for whether it has been compromised. If the boot time suddenly and unexpectedly changes, then there’s a good chance the firmware and/or OS has been replaced.
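
The idea is simple enough to sketch in a few lines. This is an illustrative example only; the baseline, tolerance, and function names are hypothetical, and Intel has not disclosed how its actual boot-time measurement is implemented:

```python
# Hypothetical sketch of boot-time measurement as a tamper canary.
# BASELINE_BOOT_MS and TOLERANCE_MS are assumed values for illustration,
# not figures from Intel's implementation.

BASELINE_BOOT_MS = 4200   # boot duration measured on a known-good image
TOLERANCE_MS = 150        # allowed run-to-run jitter

def boot_looks_tampered(measured_boot_ms: int) -> bool:
    """Flag the boot as suspicious if its duration drifts outside
    the expected window around the known-good baseline."""
    return abs(measured_boot_ms - BASELINE_BOOT_MS) > TOLERANCE_MS

print(boot_looks_tampered(4250))  # small drift, within tolerance -> False
print(boot_looks_tampered(5100))  # large drift, possible new firmware/OS -> True
```

A swapped-out firmware image or OS typically changes how long the platform takes to come up, which is what makes the timing a useful (if coarse) canary alongside conventional measured-boot attestation.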

Meanwhile also new to the E3900 series is what Intel is calling their Time Coordinated Computing Technology. This high precision timing mechanism allows multiple Atom E3900s to be tightly synchronized, to within 1 microsecond of each other. We’re told that this feature is of particular interest to Intel’s manufacturing partners, as a means to improve accuracy and synchronization between devices on a manufacturing line.
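
Intel hasn’t detailed how TCC works internally, but tight cross-device synchronization is generally built on two-way time transfer, the same idea behind IEEE 1588 PTP. The sketch below shows that generic offset calculation; it is not Intel’s implementation, and the symmetric-delay assumption is the standard caveat:

```python
# Generic two-way time-transfer offset estimate (PTP-style).
# This illustrates the general technique, not Intel's TCC internals.

def clock_offset(t1: float, t2: float, t3: float, t4: float) -> float:
    """Estimate the slave clock's offset from the master using one
    request/response exchange:
      t1 = master sends sync, t2 = slave receives it,
      t3 = slave sends response, t4 = master receives it.
    Assumes the network path delay is symmetric in both directions."""
    return ((t2 - t1) + (t3 - t4)) / 2.0

# Example: slave clock runs 5 time units ahead, one-way delay of 2 units.
print(clock_offset(100.0, 107.0, 110.0, 107.0))  # -> 5.0
```

Once the offset is known, each device disciplines its local clock toward the master; doing this well enough to hold devices within a microsecond is what makes the feature interesting for lockstep machinery on a production line.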

Finally, a bit more nebulous, the E3900 die also includes some changes specifically to improve determinism. What Intel is pitching here isn’t hard determinism, but a higher level of determinism for devices that need better guarantees about how soon an action will be completed. Apparently a big part of implementing this is at the cache level, with Intel noting that polling loops in particular benefit greatly from this change.

All told, Intel will be offering three SKUs of the E3900, ranging from 6.5W to 12W. As relatively high power processors these aren’t meant for wearables and such, but rather primarily for devices on mains power where additional intelligence is needed. In Intel terminology, the E3900 is focused on “edge” devices as opposed to “core” devices. The idea being that Intel wants to push data processing out to the edge of an IoT network – into sensors and similar devices – rather than relying on dumb sensors that send data back for central processing.

Intel’s three big markets here are the video/sensor, industrial, and automotive markets. The first and last in particular are areas the previous E3800 couldn’t readily compete in, due to a lack of processing power for image processing and video encoding. Thanks in large part to the Gen9 GPU and some E3900-specific Image Processing Unit (IPU) changes – the chip can support 15 1080p30 video inputs – Intel can now go after these markets. And that may be the biggest part of the story here for Intel: they haven’t had a part like the E3900 before. The Bay Trail based E3800 had a decent enough CPU, but it’s the E3900 where GPU computing and computer vision become viable, and this is the cornerstone of a lot of new functionality.

Of course, hardware is only part of the picture. Along with the E3900 itself, Intel will also be shipping a number of software libraries to help developers take better advantage of the hardware, and really, bootstrap the whole process. A good deal of this is on the image processing side, providing functions that tap into the hardware’s new image processing capabilities.

Finally, in the first half of next year, the E3900 series will be joined by the A3900 series. This is an automotive-specific SKU that is rated for higher operational temperatures: 110°C versus 85°C for the E3900. As we mentioned before, automotive is a big part of Intel’s efforts here, and that means matching automotive tolerances. Given the performance of these chips, we don’t get the impression that Intel is entering the fully autonomous car market right now, but they are hoping to go after some lower-hanging fruit with driver assistance and in-car entertainment systems.

19 Comments

  • Meteor2 - Tuesday, October 25, 2016 - link

    I wonder why there's emphasis on things like 'fast gfx and media processing' with 'smooth rich graphics' when the intended use-cases don't have screens.
  • CaedenV - Tuesday, October 25, 2016 - link

    So your robot can have an LCD face that lets you know when it is displeased with humanity and is about to start an uprising.
  • Murloc - Tuesday, October 25, 2016 - link

    Well, some of the use-cases must have screens, since they cite that it supports 3 displays.

    I guess interactive things.
  • name99 - Tuesday, October 25, 2016 - link

    I think their dream is that this thing controls your car entertainment system, and thus is driving, e.g., a driver screen just below the steering wheel, a console screen between the two front seats, and a back-passenger screen.
    Now whether it's cheaper in that role than an ARM solution (which may be multi-chip, but who cares) and whether usable software exists (*cough* Microsoft Sync *cough*) we shall see...
  • Ej24 - Tuesday, October 25, 2016 - link

    It's probably more for digital image processing, video encode and decode, audio processing, etc. All of these things can happen without a display on the device doing the compute. GPUs are far better suited for certain tasks, so it makes sense to include it even if it's not used for the typical user interface. I'm thinking a wifi-connected home security system that can record, compress, and stream over the internet. That's probably a good example.
  • supdawgwtfd - Tuesday, October 25, 2016 - link

    If you're thinking NVR-type duties, you may be surprised to know that the NVR itself requires zero GPU functionality.

    All they do is save the stream supplied by the camera itself.

    So a nice fast CPU is what is needed, not a GPU at all.
  • State of Affairs - Tuesday, October 25, 2016 - link

    Nice to see new silicon coming from Intel. It should help spur competition.

    However, what is the status of Denverton? AnandTech had an article up back on July 15 discussing Denverton. The article included pictures of a development motherboard. But there was no information released during IDF in August.
  • Jhlot - Tuesday, October 25, 2016 - link

    Where is the Goldmont with no EUs and lots of cores, the Avoton successor?
  • State of Affairs - Tuesday, October 25, 2016 - link

    That's Denverton.
