Knowledge graph evolution: Platforms that speak your language

Knowledge graphs are among the most important technologies for the 2020s. Here is how they are evolving, with vendors and standards bodies listening, and platforms becoming fluent in many query languages.
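To make "fluent in many query languages" concrete, here is a minimal sketch of querying a tiny knowledge graph from Python with rdflib. The library, namespace and data are illustrative choices for this example, not any particular vendor's platform, and the same question could just as well be phrased in Cypher or Gremlin on a multi-language engine.

```python
# Illustrative only: a tiny in-memory knowledge graph queried with SPARQL via rdflib.
# The namespace and triples are made up for the example.
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/")

g = Graph()
g.add((EX.alice, RDF.type, EX.Person))
g.add((EX.alice, EX.knows, EX.bob))
g.add((EX.bob, RDF.type, EX.Person))
g.add((EX.bob, EX.name, Literal("Bob")))

# The same question ("who does alice know?") could be asked in SPARQL,
# Cypher, or Gremlin on a multi-language platform; SPARQL is shown here.
query = """
    SELECT ?person
    WHERE { <http://example.org/alice> <http://example.org/knows> ?person . }
"""

for row in g.query(query):
    print(row.person)  # -> http://example.org/bob
```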
5 technology trends for the roaring 20s, part 2: AI, Knowledge Graphs, infinity and beyond

You don't have to be a fortune teller to identify AI as the key trend for the 2020s. But there is nuance regarding AI hardware and software that deserves to be highlighted.
Quantum Orchestration Platform: A virtual machine for quantum computing

Startup Quantum Machines unveils what could be a key to accelerating quantum computing adoption, one that is already used in production.
5 technology trends for the roaring 20s, part 1: Blockchain, cloud, open source

Data is changing the paradigm in everything from business to social interactions. Here is what will shape the data landscape for the years to come.
Deep Learning Software vs. Hardware: NVIDIA releases TensorRT 7 inference software, Intel acquires Habana Labs

The latest release of NVIDIA's software library brings significant performance improvements, which NVIDIA says enable conversational AI. But Intel is stepping up its game too, by acquiring Habana Labs, an AI chip startup that promises top performance at the hardware level.
Why autonomous vehicles will rely on edge computing and not the cloud

When driving a vehicle, milliseconds matter. Autonomous vehicles are no different, even though it may be your AI that drives them. AI = data + compute, and you want your compute to be as close to your data as possible. Enter edge computing.
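As a back-of-the-envelope illustration of why those milliseconds matter, here is a short sketch of how far a car travels while waiting on a round trip to remote compute. The speeds and latencies are assumptions picked for the example, not measured figures.

```python
# Back-of-the-envelope illustration (numbers are assumptions, not measurements):
# how far a vehicle travels while waiting on a round trip to remote compute.
def metres_travelled(speed_kmh: float, latency_ms: float) -> float:
    """Distance covered at a given speed during a given latency."""
    return (speed_kmh * 1000 / 3600) * (latency_ms / 1000)

# Hypothetical latencies: a distant cloud round trip vs. on-vehicle / roadside edge compute.
for label, latency_ms in [("cloud round trip", 100), ("edge", 5)]:
    print(f"{label}: ~{metres_travelled(100, latency_ms):.1f} m at 100 km/h")

# A 100 ms detour to a remote data centre is roughly 2.8 m of travel,
# which is why inference tends to stay close to the sensors.
```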
Unifying cloud storage and data warehouses: Delta Lake project hosted by the Linux Foundation

Mixing open source foundations with commercial adoption, the strategy Databricks has chosen could set Delta Lake on its way to becoming a standard for storing data in the cloud.
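For a sense of what Delta Lake looks like from the developer's side, here is a minimal PySpark sketch of writing and reading a Delta table. It assumes a Spark session configured with the Delta Lake package; the path and data are made up for the example.

```python
# Illustrative only: writing and reading a Delta Lake table with PySpark.
# Assumes a Spark session set up with the Delta Lake package available;
# the path and rows below are invented for the example.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("delta-sketch")
    .getOrCreate()
)

df = spark.createDataFrame(
    [(1, "sensor-a", 21.3), (2, "sensor-b", 19.8)],
    ["id", "device", "temperature"],
)

# Delta stores the data as Parquet files plus a transaction log,
# which is what gives cloud object storage ACID-like semantics.
df.write.format("delta").mode("overwrite").save("/tmp/delta/readings")

readings = spark.read.format("delta").load("/tmp/delta/readings")
readings.show()
```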
NVIDIA’s AI advance: Natural language processing gets faster and better all the time

Yesterday NVIDIA announced record-breaking developments in machine learning for natural language processing. How and why did it do this, and what does it mean for the world at large?
Forget silicon – SQL on DNA is the next frontier for databases

A couple of years back, even researchers would wave off using DNA to store data as something too futuristic to have any practical value. Today, you can extend PostgreSQL with the right software and bio-chemical modules, and run SQL on DNA.
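The sketch below is only a toy illustration of the idea, not the error-corrected encoding real DNA storage systems or the PostgreSQL work use: it maps bytes to nucleotide letters at two bits per base, which is enough to show how a relational row could, in principle, become a strand.

```python
# Toy illustration: representing data as a sequence of nucleotides.
# This two-bits-per-base mapping is a simplification for the example,
# not the encoding actual DNA storage systems use.
BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}


def encode(data: bytes) -> str:
    """Encode bytes as a string of A/C/G/T, two bits per base."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))


def decode(strand: str) -> bytes:
    """Decode a strand of A/C/G/T back into bytes."""
    bits = "".join(BASE_TO_BITS[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))


row = b"SELECT 42"
strand = encode(row)
print(strand)  # prints the A/C/G/T strand representing the bytes above
assert decode(strand) == row
```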
Advancing human exploration: Is space the final frontier, and how can data and AI get us there?

Fifty years after the moon landing, it's not just NASA working on what many consider the final frontier for humanity: space travel. NASA, however, is special, and one of the reasons is that data is at the heart of what it does.