Ransomware is a form of malicious software that locks up your computer by encrypting its files and demands payment to restore them. Wanna Decryptor, or WannaCry, is a form of ransomware that affects Microsoft’s Windows operating system. When a system is infected, a pop-up window appears, prompting you to pay to recover all your files within three days, with a countdown timer on the left of the window. It adds that if you fail to pay within that time, the fee will be doubled, and if you don’t pay within seven days, you will lose the files forever. Payment is accepted only in Bitcoin.

How does it spread?

According to the US Computer Emergency Readiness Team (US-CERT), under the Department of Homeland Security, ransomware spreads easily when it encounters unpatched or outdated software. Experts say that WannaCry is spread by an internet worm — software that spreads copies of itself by hacking into other computers on a network, rather than the usual case of prompting unsuspecting users to open attachments. It is believed that the cyber attack was carried out with the help of tools stolen from the National Security Agency (NSA) of the United States.

Some forms of malware can lock the computer entirely, or set off a series of pop-ups that are nearly impossible to close, thereby hindering your work.

What can be done to prevent this?

The best way to protect your computer is to create regular backups of your files. The malware only affects files that exist on the computer. If you have created a thorough backup and your machine is infected with ransomware, you can reset your machine to begin on a clean slate, reinstall the software and restore your files from the backup. According to Microsoft’s Malware Protection Centre, other precautions include regularly updating your anti-virus program; enabling pop-up blockers; updating all software periodically; ensuring that SmartScreen (in Internet Explorer) is turned on, which helps identify reported phishing and malware websites; and avoiding attachments that appear suspicious.
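As a rough illustration of the backup advice above (the paths and the timestamp naming scheme here are invented for the example), a periodic backup of a folder can be scripted in a few lines of Python:

```python
import shutil
import time
from pathlib import Path

def backup(source: str, backup_root: str) -> Path:
    """Copy the whole source tree into a timestamped folder under backup_root."""
    stamp = time.strftime("%Y%m%d-%H%M%S")
    dest = Path(backup_root) / f"backup-{stamp}"
    shutil.copytree(source, dest)  # raises if dest already exists
    return dest
```

Run on a schedule, and ideally written to a drive that is disconnected afterwards, such backups let you restore files without paying a ransom.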

Who has it affected so far?

The attack was first reported from Sweden, Britain and France, but Russia and Taiwan are said to be the worst hit, according to US media. Over 75,000 systems have been affected. Major organisations that have reported attacks include FedEx, Telefonica and the National Health Service (UK).

Now, turn any surface into a touch screen

Spraying a coating of electrically conductive material does the trick, researchers show

Scientists have developed a new technology that can turn any surface — including walls, furniture and steering wheels — into a touch screen using tools as simple as a can of spray paint.

The “trick” is to apply electrically conductive coatings or materials to objects or surfaces, or to craft objects using conductive materials, researchers said.

By attaching a series of electrodes to the conductive materials, researchers from Carnegie Mellon University in the U.S. showed they could use a well-known technique called electric field tomography to sense the position of a finger touch.

A simple idea

“For the first time, we have been able to take a can of spray paint and put a touch screen on almost anything,” said Chris Harrison, Assistant Professor at Carnegie Mellon’s Human-Computer Interaction Institute.

With the new technology dubbed Electrick, conductive touch surfaces can be created by applying conductive paints, bulk plastics or carbon-loaded film.

Yang Zhang, a PhD student at the institute, said that Electrick is both accessible to hobbyists and compatible with common manufacturing methods, such as spray coating, vacuum forming and casting/molding, as well as 3D printing.

Like many touchscreens, Electrick relies on the shunting effect — when a finger touches the touchpad, it shunts a bit of electric current to ground.

By attaching multiple electrodes to the periphery of an object or conductive coating, Mr. Zhang and his colleagues showed they could localise where and when such shunting occurs. Electrick can detect the location of a finger touch to an accuracy of one centimetre, which is sufficient for using the touch surface as a button or slider, Mr. Zhang said.
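The localisation step can be illustrated with a toy model. In the sketch below, the electrode positions, the current readings and the weighted-centroid estimate are all invented for illustration; real electric field tomography solves a harder inverse problem than this:

```python
def estimate_touch(electrodes, current_drops):
    """Estimate the touch position as the centroid of the electrode
    positions, weighted by how much current each electrode lost
    when the finger shunted current to ground."""
    total = sum(current_drops)
    x = sum(ex * w for (ex, _), w in zip(electrodes, current_drops)) / total
    y = sum(ey * w for (_, ey), w in zip(electrodes, current_drops)) / total
    return x, y

# Four electrodes at the corners of a 10 cm x 10 cm conductive coating
electrodes = [(0, 0), (10, 0), (10, 10), (0, 10)]
# A touch towards the right sheds more current near the right-hand electrodes
drops = [1.0, 3.0, 3.0, 1.0]
print(estimate_touch(electrodes, drops))  # (7.5, 5.0)
```

The centimetre-level accuracy quoted above is consistent with this kind of coarse, electrode-limited estimate.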

Bionic hand

AI is slowly finding its way to prosthetic limbs

Artificial Intelligence (AI) is transforming our lives; and not in the mad scientist or apocalyptic science fiction kind of way. Well, at least not yet. From home automation to driverless cars, AI is at the forefront of many a pioneering technology—all aimed at making life easier.

Now, AI is slowly finding its way to prosthetic limbs. Again, it’s not at the level you may have seen in movies (like Dr. Octopus in Spider-Man and his metallic tentacles), but marked improvements in prosthetic limbs and their functioning have been made.

Researchers at Newcastle University, United Kingdom, have engineered a limb—a hand, specifically—that can “see” objects for itself.

This is enabled by a camera affixed to the hand’s knuckles that takes in the object close to the hand; the hand then reacts by grasping the object — all in a matter of milliseconds. According to the abstract of the study submitted to the Journal of Neural Engineering, the engineers used a deep learning-based artificial vision system to improve the hand’s functionality.

The engineers trained a neural network structure (a system modelled along the lines of the human nervous system) with pictures of about 500 objects that can be grasped. Objects were classified into four different classes of grasps and each object had 72 images to help the neural network identify it in detail.

When the hand was tested on two amputee volunteers, they were able to grasp and move the targeted objects with an 88% success rate, the abstract noted. The user can also override the bionic hand’s functioning and control grasping actions themselves.
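In miniature, the decision the hand makes can be sketched as a classifier over four grasp classes. Everything below — the feature values, the weights and the class names — is invented for illustration; the actual system is a deep neural network trained on images of the objects:

```python
# Four grasp classes, loosely matching the article's list: picking up a cup,
# holding a TV remote, a thumb-and-two-fingers (tripod) grip, and a pinch.
# Weights are illustrative stand-ins for a trained network's parameters.
WEIGHTS = {
    "cup": [0.9, 0.1, 0.0],
    "remote": [0.1, 0.9, 0.0],
    "tripod": [0.2, 0.2, 0.6],
    "pinch": [0.0, 0.1, 0.9],
}

def choose_grasp(features):
    """Pick the grasp whose weight vector best matches the object features."""
    return max(WEIGHTS, key=lambda g: sum(w * f for w, f in zip(WEIGHTS[g], features)))

# A small, thin object (high third feature) maps to the pinch grip
print(choose_grasp([0.1, 0.2, 0.9]))  # pinch
```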

This study could pave the way to much more advanced breakthroughs in prosthetic limbs, such as connecting them to nerve endings to enable direct control over the limb. Now, that sounds like something out of Spider-Man, after all.

> Camera affixed to the knuckles takes a picture of the object in front

> Neural networks help the limb identify the object and grasp it

> Reaction time is just a few milliseconds

> Programmed to perform four different “grasps”—picking up a cup, holding a TV remote, gripping objects with thumb and two fingers, or a pinched thumb and first finger

IIT-M makes white light from pomegranate, turmeric extracts

This could be used in applications such as tunable laser, LEDs and white light display

Dr. Vikram Singh, a former research scholar in the Department of Chemistry, IIT Madras, won the BIRAC Gandhian Young Technological Innovation (GYTI) Award 2017 for his work on producing white light emission using natural extracts.

Dr. Singh and Prof. Ashok Mishra from the Department of Chemistry, IIT Madras used a mixture of two natural extracts — red pomegranate and turmeric — to produce white light emission. The researchers used a simple and environment-friendly procedure to extract dyes from pomegranate and turmeric.

While the polyphenols and anthocyanins present in red pomegranate emit in the blue and orange-red regions of the spectrum respectively, curcumin from turmeric emits in the green region. White light emission is produced when red, blue and green mix together. This is probably the first time white light emission has been generated using low-cost, edible natural dyes. The results were published in the journal Scientific Reports.

“We had to mix the two extracts in a particular ratio to get white light,” says Dr. Singh, the first author of the paper; he is currently at Lucknow’s CSIR-Central Drug Research Institute (CDRI). By changing the concentration of the two extracts, the researchers were able to get different colour temperatures (tunability).

“When we mix the two extracts and irradiate it with UV radiation at 380 nm, we observed energy transfer (FRET mechanism) taking place from polyphenols to curcumin to anthocyanins, which helps to get perfect white light emission,” says Dr. Singh. For the FRET mechanism to take place, there must be spectral overlap between the donor and the acceptor.

Energy transfer

In this case, there is a perfect overlap of emission of polyphenols with absorption by curcumin so the energy from polyphenols is transferred to curcumin. Since there is also a perfect overlap of emission of curcumin with absorption by anthocyanin, the energy of curcumin is transferred to anthocyanin.

As a result of this energy transfer from one dye to the other, when the extract is irradiated with UV light at 380 nm, the polyphenols emit in the blue region of the spectrum and transfer their energy to curcumin. The excited curcumin emits in the green region of the spectrum and transfers its energy to anthocyanin, which emits light in the red region.
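The cascade can be captured in a toy energy-budget model. The transfer efficiencies below are invented for illustration, not measured values from the paper; the point is only that a suitable split of the absorbed energy across the three emitters yields balanced blue, green and red output:

```python
def cascade_emission(absorbed, t1, t2):
    """Toy FRET cascade: polyphenols either emit (blue) or transfer a
    fraction t1 of the absorbed energy to curcumin; curcumin either
    emits (green) or transfers a fraction t2 onward to anthocyanin,
    which emits (red)."""
    blue = absorbed * (1 - t1)
    green = absorbed * t1 * (1 - t2)
    red = absorbed * t1 * t2
    return blue, green, red

# Transfer efficiencies of 2/3 and 1/2 split the energy equally three
# ways, the balance needed for white emission
blue, green, red = cascade_emission(3.0, 2/3, 1/2)
print(blue, green, red)  # each channel comes out at about 1.0
```

Changing t1 and t2 shifts the balance between the three channels, which is the kind of tunability the researchers achieved by changing the concentrations of the two extracts.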

“Because of the energy transfer, even if you excite in the blue wavelength we were able to get appropriate intensity distribution across the visual wavelength,” says Prof. Mishra, who is the corresponding author of the paper.

Without turmeric

Taking the work further, the duo produced carbon nanoparticles using pomegranate and, to their surprise, found that these produced a fairly green emission. So instead of using turmeric to get the green wavelength, the researchers used carbon nanoparticles made from pomegranate extract. “We could get white emission, though it is not as white as when we use turmeric. It’s slightly bluish but well within the white zone,” says Prof. Mishra. “It is attractive to use a single plant source to create white light emission.” The principle by which the pomegranate extract and the carbon nanoparticles made from it produce white light is the same as when pomegranate and turmeric extracts were used. The results were published in the Journal of Materials Chemistry C.

Though this natural mixture of dyes can be used in a wide variety of applications such as tunable lasers, LEDs and white light displays, much work needs to be done on photostability and chemical stability before it is ready for translation. Biosystems have an inherent tendency to break down, and this has to be addressed.

India launches satellite to help South Asian nations

Leaders of neighbouring countries, barring Pakistan, join Modi via video conference in celebrating the successful launch by ISRO.

The South Asia Satellite, or GSAT-9, termed India’s technology largesse from the sky to the peoples of the region, was flown into space on a GSLV rocket at 4.57 p.m. on Friday.

In a televised teleconference with Prime Minister Narendra Modi soon after the launch, leaders of the six benefiting nations hailed the gesture as a new face of cooperation in space for common good of the neighbourhood.

War-ravaged Afghanistan alone does not share a border with India. Its Prime Minister Ashraf Ghani said, “If cooperation through land is not possible, we can be connected through space.”

The 2,230-kg communication spacecraft will support communication, broadcasting and Internet services, disaster management, tele-medicine, tele-education and weather forecasting in a region that is geographically challenging and economically lagging, with limited technological resources, they echoed in their addresses.

Free services

The spacecraft and the launch are estimated to have cost India around ₹450 crore. Its applications touch everyday life, and the neighbours can use them free of charge.

About 17 minutes after the launch, GSAT-9/South Asia Satellite, carrying 12 Ku-band transponders, was put into a temporary oval orbit by the GSLV-F09 rocket from Sriharikota in coastal Andhra Pradesh. The Indian Space Research Organisation later said the major phases of the flight took place as planned.

The South Asia Satellite now orbits the Earth in an oval orbit, 169 km at its nearest point and 36,105 km at its farthest, “with an orbital inclination of 20.65 degrees with respect to the Equator.”

The orbit will be made circular through manoeuvres from the Master Control Facility in Hassan in Karnataka.
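The quoted transfer orbit can be sanity-checked with textbook orbital mechanics. The sketch below uses standard values for the Earth's radius and gravitational parameter; it is a back-of-the-envelope check, not ISRO's flight dynamics:

```python
import math

MU_EARTH = 398_600.4418   # km^3/s^2, Earth's gravitational parameter
R_EARTH = 6_371.0         # km, mean Earth radius

def orbital_period_hours(perigee_alt_km, apogee_alt_km):
    """Period of an elliptical orbit, from perigee and apogee altitudes."""
    semi_major_km = (2 * R_EARTH + perigee_alt_km + apogee_alt_km) / 2
    period_s = 2 * math.pi * math.sqrt(semi_major_km**3 / MU_EARTH)
    return period_s / 3600

# GSAT-9's initial oval orbit: 169 km x 36,105 km
print(round(orbital_period_hours(169, 36_105), 1))  # about 10.6 hours
```

A period of roughly ten and a half hours is typical of a geostationary transfer orbit, before the satellite is raised to its final circular orbit.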

Soon after it was released from the last stage of the rocket, GSAT-9’s two solar arrays opened out automatically and its engineers at the MCF took charge of the satellite. It will start working after all its instruments are switched on and tested in the coming days.

Common goals

Prime Minister Narendra Modi tweeted about the launch immediately. He congratulated ISRO on achieving a flawless lift-off and said the launch fulfils India’s promise, made in July 2014, of a regional satellite. “With this launch we have started a journey to build the most advanced frontier of our partnership. With its position high in the sky, this symbol of South Asian cooperation would meet the aspirations of economic progress of more than 1.5 billion people in our region and extend our close links into outer space.”

South Asian leaders pat Modi for gifting satellite to the region

Afghanistan, Bangladesh, Bhutan, The Maldives, Nepal and Sri Lanka, along with India, “will together achieve effective communication; better governance, better banking and better education in remote areas; more predictable weather forecasting, land monitoring and efficient resource mapping; linking people with top end medical services through tele-medicine; and a quick response to natural disasters.”

Apart from Dr. Ghani, Sheikh Hasina Wazed of Bangladesh; Tshering Tobgay of Bhutan; and Pushpa Kamal Dahal of Nepal; and Presidents Abdulla Yameen Abdul Gayoom of The Maldives and Maithripala Sirisena of Sri Lanka took part in the teleconference.

Vice-President M. Hamid Ansari congratulated the space agency on the launch and enabling cooperation through space.

100 years with our closest star

Indian Institute of Astrophysics releases digitised images of the sun for researchers and science enthusiasts

Every day, since 1904, staff at the Kodaikanal Solar Observatory in Tamil Nadu have aimed their telescope at the sun, freezing the images of its disc. This data, spanning a hundred years and more, has now been digitised by astrophysicists from the Indian Institute of Astrophysics, Bengaluru, and made available to the public.

Apart from its use in academic studies of the long-term behaviour of the sun, the data can be used to better understand sunspot activity, which impacts climate and affects telecommunication systems. It also throws light on major events in the past which had an impact on the earth’s magnetic field. “From that knowledge we may understand the current and future events with greater precision. This also allows us to predict future [sunspot] activity levels with better accuracy,” says Dipankar Banerjee, IIAP, the Principal Investigator.

While ‘spectroheliograms’ had been taken at the Kodai observatory since 1902, it was in 1909 that the data was used to discover the Evershed effect – that gases in the sunspots flowed radially outwards. The discovery by John Evershed put the KSO on par with the best observatories in the world. But its importance eventually declined as it was not upgraded or maintained. In a backhanded way, though, this turned out to be beneficial, because “the pictures had all been taken with the same instrument over the years, and this made it much easier to calibrate and digitise,” says Sudip Mandal, a Ph.D student who has worked on the project.

The data is unique not only in that it spans a hundred years, but also in that there are three sets of images, taken using different filters – white light, H-alpha and Calcium-K. It is known that the sun has a layered structure, and each of these data sets exposes a different layer.

Under white light filtering, the sun’s photosphere and the sunspots are visible, while the Calcium-K light can show layers some 2,000 km above this, in the chromosphere. The H-alpha images show up layers a little above the Calcium-K images. Features called “filaments” which are related to large expulsions of material from the sun’s surface can be viewed in the Calcium-K sets.

Opening up the digitised data has attracted international attention: the Max Planck Institute, Göttingen; the National Astronomical Observatories of China, Beijing; and the Big Bear Solar Observatory, US, are interested in studying the way the sun’s luminosity changes. Though the sun appears to have a steady brightness, its luminosity actually undergoes changes over time. The Big Bear Solar Observatory and Beijing teams are interested in the H-alpha data in order to study the filaments that can be observed in those shots. Within India, groups from IUCAA, Pune; the Physical Research Laboratory, Ahmedabad; and IISER, Kolkata, want to make studies.

A movie that the scientists made out of a sequence of hundreds of white light images shows how the sunspots appear and disappear periodically over an eleven-year cycle. Such movies offer immense possibilities for developing educational software, as classes of students can visually experience how the sun and the sunspots behave over the years. Just like CERN offers its data to science hobbyists, for analysis that does not require much training and yet cannot be carried out without human intervention, this data, too, could be used by science fora in India to build citizen science projects.

The data was historically archived on photographic plates and film. After digitisation, the images are preserved in a high-resolution digital format. “We store it in FITS [flexible image transport system], which is the most commonly used digital file format,” clarifies Dr Banerjee.
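The FITS format mentioned above is simple enough to sketch: a FITS file begins with a header made of 80-character ASCII “cards” packed into 2880-byte blocks, terminated by an END card. The toy reader below parses only that header layout (it is not a full FITS parser, and the sample cards are hand-built):

```python
def parse_fits_header(block: bytes) -> dict:
    """Read keyword/value pairs from a FITS header block of 80-char cards."""
    header = {}
    for i in range(0, len(block), 80):
        card = block[i:i + 80].decode("ascii")
        keyword = card[:8].strip()
        if keyword == "END":
            break
        if card[8:10] == "= ":                      # value indicator
            header[keyword] = card[10:].split("/")[0].strip()
    return header

# A minimal hand-built header: cards padded to 80 chars, block to 2880 bytes
cards = ["SIMPLE  =                    T",
         "BITPIX  =                   16",
         "END"]
block = "".join(c.ljust(80) for c in cards).ljust(2880).encode("ascii")
print(parse_fits_header(block))  # {'SIMPLE': 'T', 'BITPIX': '16'}
```

In practice such files are read with an established library rather than by hand; the sketch only shows why the format has stayed portable for decades.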

Digitising this has been a challenging task that involves not just reading and displaying the images but also extracting information – for instance, differentiating a sunspot from artefacts such as a scratch or a fungal streak. “It can only be done using a lot of sophisticated mathematical tools. Some are available; some we have had to develop to handle these challenges,” says Dr Banerjee.

This data can be freely downloaded from https://kso.iiap.res.in and is also available on request through the contact details given on the website.

The project, which was initiated about six years ago by S.S. Hasan, then the director of IIAP, has succeeded in digitising some sixty to seventy thousand images previously stored on photographic plates. The team includes scientists and a large team of research assistants at the Kodaikanal lab.

At the moment, the group has released the “lowest level” or raw data and plans are on to eventually release the processed ones, too.

Do your jeans have a Bluetooth connection?

Paris show displays ‘smart’ denims that can give you street directions and send e-mail alerts

A young man in a white T-shirt pulls on a dark blue denim trucker jacket, tucks his smartphone in an inside pocket and puts in-ear headphones in his right ear.

He mounts a fixed-gear bike with flat, slightly curved wide handlebars. Riding through the streets of San Francisco, he occasionally taps or swipes his right hand over the left cuff of his jacket as the directions he’s listening to pop up on screen in this advertisement.

It’s an ad from iconic U.S. jeans maker Levi Strauss for Project Jacquard, an initiative with Google that the companies started two years ago for so-called “smart” denim.

The future of the popular fabric was the focus at a recent international fashion fair in Paris.

Wearable innovations

The fair featured many wearable innovations such as a waterproof jacket with sunscreen bands and a cable in the pocket to recharge a cellphone, or jeans that keep your body temperature stable. Once mainly the purview of athletic gear — with moisture-wicking shirts and trousers and then clothing that can track motion, heart rate, and body temperature — the new trend for fashion designers is to take everyday wear and transform it using new technologies.

French-based fashion company Spinal Design, for example, has created jeans that can give wearers directions without having to whip out the mobile at every single intersection.

Through Bluetooth sensors stitched into the jeans’ waistband, the smartphone stays out of sight.

“Sensors will vibrate right if you need to turn right, left if you need to turn left,” said Spinal’s innovation director Romain Spinal.

In 2015, the company designed a bikini that tells women when it’s time to apply more sunscreen. The two-piece retails for €149 (₹10,500) and comes with a detachable ultraviolet sensor that, through a smartphone or tablet, sends a “sunscreen alert” when the sunbather’s skin needs more cream.

The Spinal jeans, made in France, cost €150 and also have e-mail notification capabilities. “They will vibrate differently depending on whether the message received is from your family, your friends or work, in a way that you won’t have to constantly check your e-mail on weekends or on vacation,” Mr. Spinal said.

On their end, Google and Levi expect to release their denim jacket sometime this year, but it will come with a hefty $350 (₹22,500) price tag due in part to its special interactive fabric that allows the jacket’s wearer to order various products online.

Researchers develop synthetic soft retina

The replica is made of hydrogels and cell membrane proteins

Scientists from the University of Oxford have developed a synthetic, soft tissue retina that closely mimics the natural retinal process.

The researchers believe that their efforts could lead to the development of less invasive products that closely resemble human body tissues, helping to treat degenerative eye conditions such as retinitis pigmentosa. The condition changes how the retina responds to light, causing people to slowly lose vision.

Until now, artificial retinal research has used mostly rigid, hard materials.

“The human eye is incredibly sensitive, which is why foreign bodies like metal retinal implants can be so damaging, leading to inflammation and/or scarring. But a biological synthetic implant is soft and water-based, so much more friendly to the eye environment,” said lead researcher Vanessa Restrepo-Schild from Oxford University.

Just as photography depends on camera pixels reacting to light, vision relies on the retina performing the same function.

The retina sits at the back of the human eye, and contains protein cells that convert light into electrical signals that travel through the nervous system, triggering a response from the brain, ultimately building a picture of the scene being viewed.

The synthetic, double-layered retina replica consists of soft water droplets (hydrogels) and biological cell membrane proteins.

Designed like a camera, the cells act as pixels, detecting and reacting to light to create a grey scale image.

“The synthetic material can generate electrical signals, which stimulate the neurons at the back of our eye just like the original retina,” Ms. Restrepo-Schild said. The study was published in the journal Scientific Reports.

Haptic Suits

A haptic suit (also known as tactile suit, gaming suit or haptic vest) is a wearable device that provides haptic feedback to the body.


Virtual reality reinvented

The world’s first full-body haptic feedback, motion capture, thermo-controlled suit. Enjoy incredible real-world sensations as never before.


Haptic Feedback system

Feel a wide range of sensations across your whole body, whether it’s the soft touch of warm rain, a heavy impact or even the freezing cold.

The Teslasuit haptic library provides a range of sensations to targeted areas across the body, including simultaneous stimulation of multiple muscle groups.

Motion Capture system

The Teslasuit motion capture system uses sensors to transfer the precise position of the body into virtual environments. You don’t have to think about latency or about your actual position in the VR space; Teslasuit handles it for you.


Because the T-Suit can fully mimic a user’s body movements in AR/VR environments, a digital representation (avatar or “skin”) of the user can be created – allowing a further degree of personalisation.

Climate Control system

The suit’s climate control system provides the user with extra realistic immersive sensations. Through its revolutionary heating and cooling elements rapid changes in temperature can be created within mere seconds to simulate anything from a walk in the sun to the freezing chill of an arctic wind, or from forest fires to bomb explosions.

Streaming service

Imagine a fully integrated, computer-free version (all data and graphics processing performed by the remote Teslasuit VR cloud), streaming content directly to the user’s virtual/augmented reality system and working with Teslasuit models remotely and wirelessly.


Teslasuit comes with software and hardware that help developers and users create virtual sensations. Build your own haptic presets, customise existing sensations, adjust every detail, and save and share them with your friends.




  • Size | One-Size-Fits-Most
  • Weight | 3.5 lbs
  • Haptic Zones | 16
  • Power Supply | 9V
  • Data | USB 2.0/3.0
  • Audio | 3.5mm


  • Low Latency
  • 16 families of variable, programmable, and combinable haptic effects


The Hardlight suit is a haptic feedback jacket designed for virtual reality. It allows a user to interact physically with the virtual world around them.

Haptic feedback is the sense of touch. Haptic feedback for virtual reality allows you to feel when you’ve been hit, shot, knocked, touched, brushed up against, …you get the idea. It adds the sense of touch to the virtual world.

The Hardlight suit weighs about 3.5 pounds – similar to a light jacket. The weight is distributed across your body, so when you’re wearing the suit, you won’t even notice it.

The Hardlight suit provides vibro-tactile feedback using technology similar to the vibrate function of most cell phones. The team explored many different methods of delivering feedback, and vibratory feedback proved to be the most cost-effective.

Unlike a rumble backpack, the Hardlight suit has 16 individually-activated/-tracked haptic zones, which allow you to interact with virtual reality in 3D space.
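A zone-addressing scheme like this can be sketched as follows. The grid layout, zone numbering and functions below are invented for illustration; the article does not specify how the Hardlight suit’s 16 zones are actually arranged or driven:

```python
# Hypothetical layout: 8 zones on the front (2 columns x 4 rows), 8 on
# the back, numbered 0-15. A game maps each impact point to one zone.

def nearest_zone(x, y, side="front"):
    """x, y in [0, 1) across the torso; returns a zone index 0-15."""
    col = min(int(x * 2), 1)        # 2 columns per side
    row = min(int(y * 4), 3)        # 4 rows per side
    base = 0 if side == "front" else 8
    return base + row * 2 + col

def on_hit(x, y, side, intensity):
    """Turn a collision event into a command for one haptic zone."""
    return f"vibrate zone {nearest_zone(x, y, side)} at {intensity:.0%}"

# A sword clash against the upper-right front of the torso
print(on_hit(0.8, 0.1, "front", 0.75))  # vibrate zone 1 at 75%
```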

Location-Based Haptic Feedback

Feel a sword clash against your armor. Box with friends. Reach out and touch the world around you.

Integrated Tracking

Track your entire upper body in virtual space.
See your avatar, not just a floating head and hands.

Comfort and Flexibility

Adjust the suit to fit almost any body type, and play active games without overheating or getting tired.

Multi-Platform with Easy API

Compatible with all PC-based VR headsets. Includes a developer API for easy development and integration.