Both intentions are, upon further inspection, sort of strange.
The first one is odd because it’s not entirely clear, even to researchers, why abnormally bright colors would be appetizing at all, given that when, say, the color blue appears in nature, it’s often a sign of spoilage or poison. And the second is a paradox: How could a food be made to look “more natural” by means of artificial additives?
Starting in the early 19th century, it became increasingly common for businesses to manipulate foods to give them a standardized, recognizable appearance: Bakers would whiten bread with chalk, dairy farmers would add a lead compound to milk to make it seem thicker, and, later in the century, meatpackers began to inject red dye into cuts of meat to make them look fresher. (As unhealthy as these “ingredients” sound, the bigger risk was that they were masking mold or spoilage that could sicken or kill.) But one thing that made the revolution Hisano documents possible was the discovery, in the 1850s, of a vivid magenta dye made from the liquid left over after processing coal—a repulsive-sounding (but usually safe) additive that could be synthesized on the scale necessary for mass-produced foods.
Nowadays, manipulating foods’ colors is the norm (and a much safer practice); it’s even a consumer expectation. Grocery stores know that only pristine-looking apples sell—hence the shiny wax coating that growers apply before shipping. Never mind that more “natural” apples, the ones straight from the orchard, vary in color and often have dents and bruises.
But even with those expectations shifting, artificial coloring is still considered normal. Today’s conscientious consumers mostly aren’t asking companies to stop using dyes altogether; they’re just asking them to replace synthetic dyes with natural ones. They probably don’t want foods without any coloring, which can look kind of ugly.