Unless you’ve been extremely lucky, you’ve probably been wounded at some point, whether by a knife slip while cooking or a sports injury. To remedy this unpleasant experience, you likely took some version of the following steps: clean the wound, disinfect the area, and apply a plaster or bandage. Though it is now a common and simple first-aid skill, this wound-care routine has existed since ancient times.
Some wounds, however, are far more serious than one might expect, especially chronic wounds that arise from conditions such as diabetes. The five-year survival rate of patients with chronic wounds is about 70%, worse than that for breast cancer, prostate cancer and several other diseases. Treating these wounds also adds substantially to the cost of care, amounting to about $28 billion per year in the U.S. alone.
In its traditional role, the bandage’s main function in acute or chronic wound care has been to protect the injured area from external factors that could worsen the injury, such as dirt, bacterial infection and friction. In the centuries since wound dressing was first practiced, some changes have taken place, mostly in the materials used, such as stronger-adhering waterproof bandages. The bandage itself, however, has remained a passive device.