Wednesday, August 26, 2009

The Age of Adaptability

Part 8 of Prisoners of the Real


Successful organizations, according to the equilibrium theorists of the 1950s, strike a balance between inducements and contributions. Inducements are "payments" made by the organization – wages paid to workers, services provided to customers, or income produced for investors. Though they may vary widely, inducements can be measured independently of their use to those who receive them, and have a specific value in each situation. Contributions, on the other hand, are payments made to the organization, things such as work from the workers, fees from clients, or capital from investors. They can also be measured and have "utility" values.


The main idea was that individuals and groups receive inducements from the organization, in return for which they make contributions to it. A related idea was that organizations must balance production and satisfaction, known in systems parlance as formal achievement (FA) and group need satisfaction (GNS).


Striking the proper balance between inducements and contributions is the main trick. In equilibrium theory, success depends primarily on the answers to two questions: how desirable is it to leave the organization, and how easy is it? If the inducements are greater than the contributions, people should be less likely to leave. If the inducements are too great, however, the organization will very likely go under.
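To make the balance concrete, the sketch below restates the inducement and contribution idea as a simple comparison of utilities. It is only an illustration: the participants, the utility figures, and the likely_to_stay rule are invented here, not drawn from the equilibrium theorists themselves.

    # Illustrative sketch of the inducement/contribution balance described above.
    # The participants, utility figures, and decision rule are hypothetical,
    # not taken from March, Simon, or other equilibrium theorists.

    participants = {
        "worker":   {"inducements": 7.0, "contributions": 6.0},   # wages vs. labor
        "customer": {"inducements": 5.0, "contributions": 5.5},   # services vs. fees
        "investor": {"inducements": 8.0, "contributions": 6.5},   # income vs. capital
    }

    def likely_to_stay(p):
        # Equilibrium reading: when inducements outweigh contributions,
        # leaving the organization becomes less attractive.
        return p["inducements"] >= p["contributions"]

    for name, p in participants.items():
        print(name, "stays" if likely_to_stay(p) else "is at risk of leaving")

    # The organization's survival cuts the other way: if the inducements it pays
    # out exceed the contributions it takes in, it is running at a loss.
    paid_out = sum(p["inducements"] for p in participants.values())
    taken_in = sum(p["contributions"] for p in participants.values())
    print("organization solvent:", paid_out <= taken_in)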


Management theorists think mainly in terms of behavior. In the case of equilibrium theory, they attempted to take into account the "subjective and relative character of rationality," something that had been missed by classical theorists. Nevertheless, they relied on a theory of rational choice that continued to simplify real situations. People make choices, they argued, either as a routine response to some repeated stimuli or as a problem-solving response to a novel situation. Still, an acceptably "rational" choice might be simply "satisfactory" rather than "optimal."


This distinction was critical. An optimal choice is only possible if all conceivable alternatives can be compared and the preferred one freely selected, a relatively uncommon state of affairs. A satisfactory choice, on the other hand, can be made with minimally acceptable alternatives. Put another way, you can search through a haystack for the sharpest needle, but most people will settle for any needle that is sharp enough. Rational behavior, as defined by equilibrium theorists, called for simplified situations that captured the basic aspects of a problem without worrying about all the complexities.
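The haystack image maps almost directly onto a search procedure. The following sketch is purely illustrative, assuming an invented list of needles, an invented sharpness score, and an arbitrary "sharp enough" threshold: the optimal chooser must examine every alternative before deciding, while the satisficer stops at the first one that clears its aspiration level.

    # Minimal sketch of "optimal" versus "satisficing" choice, as contrasted above.
    # The needles, sharpness scores, and threshold are invented for illustration.

    def optimal_choice(alternatives, score):
        # Requires comparing every conceivable alternative.
        return max(alternatives, key=score)

    def satisficing_choice(alternatives, score, good_enough):
        # Stops at the first alternative that clears the aspiration level.
        for alt in alternatives:
            if score(alt) >= good_enough:
                return alt
        return None  # nothing in the haystack was acceptable

    needles = ["bent", "dull", "usable", "sharp", "sharpest"]
    sharpness = {"bent": 1, "dull": 3, "usable": 6, "sharp": 8, "sharpest": 10}.get

    print(optimal_choice(needles, sharpness))         # "sharpest" - full search required
    print(satisficing_choice(needles, sharpness, 6))  # "usable" - the first sharp-enough needle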


In addition to this "satisficing" approach to choice, equilibrium theory considers several other aspects of rational behavior. Various actions and their consequences, for example, are examined sequentially, aided by systematic tools such as checklists. In recurring situations, routines are developed to serve as satisfactory alternatives. These routines can be based on past experience, and can break tasks into their elementary steps. Under the assumption that routines can be adapted to meet any organization's objectives, matters such as the time needed, the appropriate activities and the specifications of the product itself become merely technical questions. As a consequence, however, individual discretion is severely limited to matters of form, and routines are often executed in a semi-independent fashion.


Basically, rational behavior is supposed to be goal-oriented and adaptive. It should deal with a few things at a time, under the assumption that simultaneous action to both maintain and improve things isn't normally possible. Thus, a "one-thing-at-a-time" approach, with each thing placed in an orderly sequence, becomes fundamental to the structure of the organization. Speaking for many management professionals in the post-war period, equilibrium theorists asserted that work groups can only function effectively within "boundaries of rationality," and only if they recognize that "there are elements of the situation that must be or are in fact taken as givens."


By the 1970s, the burgeoning science of management had spun off a variety of approaches to organizational and human resource development. While not all of them classified managers as scientists, most did urge them to appreciate and emulate scientific methods and attitudes. The scientist, a model of educated skepticism and caution, attempts to arrive at verifiable information through logical and experimental procedures. Results rest upon premises that are themselves verifiable, or at least aren’t inconsistent with the body of recognized scientific information. Any manager who adopts this style must accept the intrinsic superiority of the analytical framework of thought. He has to accept as reality what he sees, and also believe that by describing how the parts work he can eventually gain control of the whole.


Two of the most prominent schools of management science that evolved were the behavioral school and operations research. The former is inductive and problem-centered, focusing on observable and verifiable human behaviors within organizations. Behavior-oriented management emphasizes how decisions are made, and focuses on describing the nature of operations. The latter, known as OR, is deductive, and looks at decisions in terms of how they ought to be made.


Operations research prescribes the nature of operations, often by building a mathematical model of the problem. Behavioral theory draws from psychology, sociology and anthropology within a framework of scientific procedure. OR is tied to mathematics, statistics and economics in its search for optimal solutions. Over the past few decades most management practice has incorporated aspects of both.


Although most leaders say they believe that there are limits to the use of scientific method, they continue the search for fully refined "operational" methods. Despite the increasing complexity of organizational life, management's bottom line remains its faith in the possibility of constructing systems based on what philosophers call "the true nature of the real."


The first step for any organization is, of course, planning. For rational managers, the planning process involves mainly defining the means necessary to reach desired ends. Plans provide standards against which performance can be measured; they are the basis of organizational control, and their main value to most managers is their ability to help reduce ambiguity.


There are certainly various kinds of planners. A short-range planner, for example, who thinks that only the objectives visible on the horizon can be realized, might be called "practical" or "realistic." A long-range planner, in contrast, might argue that the only truly important objectives are those beyond the visible horizon. Such a planner would undoubtedly be called a "visionary" or "idealist." The difference between these apparent extremes, however, isn’t as great as it sounds.


In government and the corporate world, most idealists and realists agree that the main tasks are to prepare early and be ready to deal with rapid change. For both types, planning is "a process of preparing for the commitment of resources in the most economical fashion and, by preparing, of allowing this commitment to be made faster and less disruptively." That definition, provided by Columbia's E. Kirby Warren in his book, Long Range Planning: The Executive Viewpoint, assumes that the purposes of planning are to speed the flow of information, help managers understand how today's decisions will affect the organization, and keep disruptions to a minimum.


The central goal of planning is to increase predictability. In order to do this, however, managers must often depend on others. As the economist F. A. Hayek explained, "The more complicated the whole, the more dependent we become on that division of knowledge between individuals whose separate efforts are coordinated by the impersonal mechanisms for transmitting the relevant information." This principle of "bounded rationality" has led many managers to favor decentralized approaches to planning, since usually no one person can balance all the considerations that have a bearing on important decisions. But delegation often leads to conflicts, since various analyses must be combined in one general plan. Thus, despite efforts to decentralize planning, decision-making must remain relatively centralized.


In order to resolve conflicts, managers rely on "satisficing" techniques such as performance measurement and estimates of costs and returns. In educational and other human resource organizations, administrators call such an approach "participatory planning." This may involve the sharing of decision-making authority, less intrusive supervision, and more accessible relationships. The goals are generally improvement of efficiency, orderly operations, better staff morale, and positive relations between the administrator and the public.


Despite a participatory style, which is basically a way to divide the labor of planning in order to solve problems more rapidly, managers and administrators remain the regulators of the process, bargaining or relying more heavily on analysis as circumstances dictate. In either case, rational planning assumes that if people pursue their various self-interests within an appropriate structure, they will be led by an invisible hand to promote ends that may have little or nothing to do with their intentions.


Next: Living with Rational Management


Previous:

The Creative Also Destroys

Deconstructing Leadership

Anatomy of Insecurity

Managers and Their Tools

The Corporate Way of Life

The Dictatorship of Time

Rules for Rationals

Thursday, August 20, 2009

Rules for Rationals

Part 7 of Prisoners of the Real


Most inhabitants of developed societies are nervous but faithful rationalists, living uneasily at the opening of the 21st century on the intellectual capital accumulated in the 17th. The central item of their faith – in fact, the motive for most scientific pursuits – is an assumption that they will never find any phenomenon that is intrinsically incapable of exhibition as an example of some general theory.


To some extent, most human beings are also managers when exerting control over their own affairs. Using argument and analytical thinking, however, "rational managers" go further: they seek to alter and shape our world, attempting to describe infinite complexity by examining its constituent parts and very often using language that would be more appropriate for the description of machines.


For most of the last century the literature of management and organization theory has defined managers as predictors and controllers of events. Whether they focused on the process or the product, whether they stressed the importance of handling tasks or tending to maintenance, the leaders of organizations were expected to regulate and organize the activities of those "below" them and in some instances even their own co-workers.


In an essay on "organizational socialization and the profession of management", Edgar Schein, a professor at MIT's Sloan School of Management, once explained that, "The essence of management is to understand the forces acting in a situation and to gain control over them." When a student attended Sloan or other business schools, taking in the general principles of behavioral science, economics and quantitative method, the main lesson was to look at things from the perspective of high-level management, adopting a stance of pure rationality and emotional neutrality. Faced with any problem, they were expected to analyze it and come to a decision free of feelings about people, the product, the company or the community. Such was the professional manager's value system.


This model of the manager – an ethically-neutral, rational and objective professional who depends on reason, respects order and seeks above all to establish clear boundaries – is almost universally accepted among both theorists and practitioners of management science. Even the more recent definition of managers as coordinators and enlightened questioners conforms to the basic model.


According to Charles Kepner and Benjamin Tregoe, authors of the classic leadership manual titled The Rational Manager, the leaders of organizations are increasingly separated from the technical knowledge and skills of those they direct. As a result, they suggest that managers develop skills in asking questions, an approach that requires less experience and specialized knowledge. "The manager today has no choice," they claim. "With management growing progressively more complex, and experience more obsolete more rapidly, the manager must rely more and more on skillful, rational questioning, and less and less on experience."


This approach, however, differs only in its techniques from the classical style. The central concern is still operational: how to make workers more efficient in the pursuit of institutional goals. The basic values – self-control, neutrality, discipline and efficient production – have gone unchallenged since Frederick Taylor articulated the principles of scientific management in 1895.


Taylor trained as an engineer and learned the ropes with large steel companies. Growing out of his own experiences, his theory of scientific management rested on four basic steps. The first, based on the notion that there is always "one best way" to do a job, was to analyze the ideal method. Next, the manager was to select the "right man" for each job and train him in the proper method. The third step was to join the method and the man, proving Taylor's point by increasing earnings. In the end, effective division of labor was supposed to lead to the fourth step, in which the manager becomes a work "planner."


After about 20 years, the focus of Taylor's theory shifted to administrative organization. The firm, he said, was a giant machine, with ideal designs for its structure, the span of supervision, and the classification of various management activities. By the early 1950s, R.C. Davis had integrated the work of earlier theorists into a consistent approach to administration. According to Davis, organizations were abstract legal entities, formed and directed by a rational system of rules and authority. Davis described managers variously as planners, organizers, and controllers. For Davis and other classical theorists, the glue holding institutions together was a mixture of authority, responsibility, and accountability. Departments based on the product, customers, or geographic areas might divide organizations horizontally, but authority was always vertical. The flow of power always came from the top.


Such a theory had some obvious limitations, which were eventually pointed out by behavioral scientists. The classical model, for example, emphasized financial motives for work that weren't always relevant. In addition, it failed to consider other variables or to discuss problems such as motivation, choice, and informal relations. Clearly, the assumptions of the theory were either incomplete or wrong.


According to the traditional view, human organizations were really no more than machines. If that was actually true, however, the leader of an organization would only be limited by the constraints imposed by the capacities, speeds, durabilities, and costs of the machine and its parts – the employees. Clearly, there was more to running an organization than that.


Thus, in the mid-1950s, a group of so-called "radical" theorists was able to discredit classical theory and replace it with their own theory of organizational equilibrium. Working at Carnegie Tech's Center for the Advanced Study in the Behavioral Sciences, two leading "radicals," James March and Herbert Simon, launched their new movement with a devastating critique. Classical theory was obsolete, they said, because it made "severe assumptions about the environment of individuals in an organization, the impact of environment upon them and their responses to it."


Traditional theory had defined the environment as a well-detailed stimulus or system of stimuli. Each order supposedly provoked a predictable psychological form, and each form included a program for generating a specified response that was appropriate to the original stimulus. The idea was that organizations have a limited number of response programs, a stimulus for each program, and a response for each stimulus. But this mechanistic view actually produced outcomes that weren't anticipated by the theorists: associations unrelated to the task at hand were often evoked, unplanned stimuli were sometimes provided, and some stimuli failed to provoke the appropriate form.


March and Simon looked at the direction and control of large bureaucracies, morale problems, and the relationship between morale and productivity. What they found was that the demands for control made by many top managers, combined with their emphasis on "reliable" behavior, often resulted in rigidity. In addition, delegation of authority in a "classically" managed organization tended to increase conflict among departments. Although the use of impersonal rules might decrease the level of tension, it also affected behavior negatively and thus increased the gap between the organization's goals and the actual achievements.


Their conclusion was that neither classical theory nor the 1930s "human relations" variants that factored in morale were working. In their ground-breaking book, Organizations, they charged that the classical approach ignored the wide range of roles that most workers performed. "It should be obvious," they wrote, "that supervisory actions based on the naive 'machine' model will result in behavior that the organization wishes to avoid."


Having established that scientific management and "human relations" approaches had failed, these behavioral theorists proceeded to offer their conditions for the survival of organizations. Their object, they claimed, was to replace "fancy with fact in understanding the human mind and human behavior," and to focus on the qualities of workers as "rational men."


Next: The Age of Adaptability


Previous:

The Creative Also Destroys

Deconstructing Leadership

Anatomy of Insecurity

Managers and Their Tools

The Corporate Way of Life

The Dictatorship of Time

Friday, August 14, 2009

The Dictatorship of Time

Part 6 of Prisoners of the Real

The use of powerful tools has not only increased productivity; it has changed humanity's fundamental relationship to time. Since the Renaissance, when the modern relationship between time and money was established, time has been "spent" in order to increase power. But in attempting to move faster and live by the clock, we have fallen out of phase with our environment, and further escalation will only deepen the disharmony. Replacing the biological clock with the mechanical clock, in a vain attempt to stay "connected" with the natural world, has proved to be a fatal mistake.


In the 18th century a new definition was added to humanity's concept of time: the "amount of time" worked under a specific contract, or, in other words, "pay equivalent to the period worked." This idea, initially connected to naval service, coincided with time's emerging relationship to the Protestant ethic. The refusal of Protestants to recognize the saints, a major step in the secularizing of life, also removed the 100 days that had previously been reserved for their celebration. Those days were subsequently added to the work year as the West gradually began to sacralize work.

Early factory life experiences of the industrial revolution confirmed, however, that automation wasn’t likely to give us more control over time – even the leisure variety. The more likely prospect was that it would lead to increased regulation in the life of every person. Humanity had tied life to the oscillatory processes of work, and both were now controlled by the clock.

Even today the image of expanded choices remains illusory. Most people experience days of "non-work," periods crowded with maintenance tasks and largely lacking in creative meaning. Such times are little more than interludes between periods of work. Life is synchronized by an artificial time known as the "clock," which in turn is synchronized by work. Thus, human beings are synchronized to work, rather than technology being synchronized to humanity.

And what of work's creative meaning? The clock's presence in almost every part of life, along with the necessity of fixing and regulating events in time sequences, has led to the belief that what one sells in a work relationship is time as well as – and often instead of – skills. And a related impression: that we sell time rather than labor. Qualitative measures, sometimes described as performance standards, may be propped up by incentives or roped to security needs, yet they are mainly superfluous. When performance is rewarded, with the expectation that such reinforcement will keep quality high enough, the worker is still being paid for time.

Time has thus become the definer of our lives, often separating us from the consequences of our acts.

As psychiatrist Thomas Szasz wrote, "Men are not rewarded or punished for what they do, but rather for how their acts are defined. This is why men are more interested in better justifying themselves than in better behaving themselves." The implications of his remark stretch to the world of work and the systemic health of organizations. Through the clock, work has come to be defined as time rather than activity. As a result, it has become far easier to deny responsibility for results, or even intentions. The important thing, after all, is that we've "put in our time." In so doing and thinking, we restrict our relations with our environment, putting out but not taking much in.

Anyone who has worked in a civil service bureaucracy knows, for example, that the real boss is the time clock. The average day is a series of short, fragmented work periods, interrupted by standardized breaks. The concentrated effort that can result from freely chosen, extended work periods is virtually outlawed. Instead, atomization of work time minimizes results. In addition, time can place serious constraints on individual initiative, even assigning it an unstated negative value. And once initiative is gone, can commitment be far behind?

Even when there is no time clock to punch, other arbitrary standards linked to time can undermine effort and promote robotic response. In many bureaucracies and businesses, high visibility can serve as a substitute for actually accomplishing anything. Not only does the elevation of visibility to a standard of performance leave the impression that "being seen" is as important as effort itself, it also represents a subtle rejection of the principle that goals can be reached from different starting points and in different ways. In system theory, this is known as "equifinality," a proposition adopted from observations of biological phenomena.

The German biologist and philosopher Hans Adolf Driesch first encountered the principle of equifinality in experiments on the embryos of sea urchins in early development. A normal sea urchin, he discovered, can develop from a complete ovum, from two halves of a divided ovum, or from the fusion of two whole ova. Driesch concluded that the basis of life is a non-mechanistic vital agency, a soul-like factor that governs processes in foresight of goals. Borrowing from Aristotle, he called this factor "entelechy," claimed that it contradicted basic ideas of physics, and subsequently became a vitalist. His own observations had led him to the belief that vital phenomena are inexplicable in terms of natural science.

One does not have to become a vitalist, however, to see that visibility and other time-related standards undermine creative impulses that often produce the best and most original work.

Similar to visibility, filler – a term commonly used in various media – is activity designed to stretch the time worked. When an unsupervised employee, for example, is expected to meet a standard work time requirement, he or she may go to great lengths to pad the record. One former colleague, faced with the challenge of filling 40 hours a week, would take long drives to meet with people and claim his travel time as part of his work. Most of the business discussed on these excursions might just as easily have been handled over the phone. And certainly, the money he received as reimbursement for his travel could have been put to better use. Because he was responsible for filling time, however, he could avoid the ethical questions. Complying with the clock's requirements, he had retreated from his consciousness of freedom. As Jean-Paul Sartre noted, this recognition is too often denied.

Most people flee from freedom, said Sartre, instead "taking refuge in the 'serious world.' This is the realm of convention where one unquestioningly accepts as absolutes the prevailing values of the group in which one finds oneself." In responding to the clock, too many of us deny the existence and demands of freedom, replacing responsibility with filler.

As these examples suggest, the elevation of time as the crucial criterion of accountability has in many respects undermined important organizational values, including that sacred cow known as efficiency. Ironically, time and motion study experts, behavioral engineers and operations researchers have repressed the human capacity to find the way to accomplish tasks that best suits individual physical structures and psychological make ups. In fact, most continue to insist that for every job there is a single best way, the "most efficient" in terms of time and effort. Their denial of the individual's ability to make wise choices is much like an anthropologist who tries to teach native people how to save steps in a traditional dance.

Objective researchers have also failed to calculate their impacts on the objects of their inquiries. When workers internalize that their performance is viewed through the screen of minutes and hours, a distorted definition of work begins to evolve. Eventually, they will come to see their work as passive, largely reactive presence on the job, and conclude that both ends and means are beyond their range of concerns.

In his novel, Breakfast of Champions, Kurt Vonnegut posed the question: "To what extent are human beings sacred, and to what extent are they machines?" The extreme answer given by his hero, car dealer Dwayne Hoover, is reminiscent of Descartes, an arch-rationalist who felt that all creatures except human beings were automata:


"Of all the creatures in the universe, only Dwayne was thinking and feeling and worrying and planning and so on. Nobody else knew what pain was. Nobody else had any choices to make. Everybody was a fully automated machine."


In a rationally-managed world, organized on the base of linear time, it is far too easy to expand such a definition even further, accepting the idea that it is work and not humanity that is sacred, and that we too have joined the ranks of the fully automated machines.


Next:
Rules for Rationals


Previous:

The Creative Also Destroys

Deconstructing Leadership

Anatomy of Insecurity

Managers and Their Tools

The Corporate Way of Life

Monday, August 10, 2009

The Corporate Way of Life

Part 5 of Prisoners of the Real

The use of negative power by both leaders and followers helps to perpetuate the centralization of control in manipulative tools. Professional imperialism – that is, rule by kings called doctors, lawyers, scientists, psychologists, and so on – tends to create a society of clients and a climate in which conditioning extends the dominant values to the mass level. In the end, everyone falls into the hands of inhuman mechanisms, tools called bureaucracies and technical systems.

Through its professional elite, for example, the American Psychological Association helped to expand the power of industrial capitalism, using everything from "morale" improvement techniques to counter-insurgency work and military psychology. Its liberal wing, the Society for the Psychological Study of Social Issues, founded in 1945, “solved” social problems by making schools, factories and wars more efficient. Its central approach was rational psychology, a powerful social science tool that served well the needs of ruling elites, with methods such as separation of subject and object, attention to external and measurable behavior, and reliance on form. All of these are related to the need to manipulate, control and predict behavior.

The paraprofessional movement, lauded as a way to provide access for the non-traditionally educated, ironically marked the democratization of professional values. After all, these "new" professionals also had to embrace achievement, analysis and isolation as major rationales for action. In the struggle for their own piece of negative power, these offshoots of various professional elites gradually changed their attitudes toward their clients.

Before long paraprofessionals, having internalized values such as individualism, deferment of gratification and rationality, began to view dealings with the lower classes as an obstacle to their professional advancement. As striving professionals, actively engaged in efforts to improve their status, they often became more negative in their orientation toward lower class or impoverished clients than their more traditional colleagues.

The impact of education, another institutional tool that became manipulative upon passing its second watershed, was far more pervasive that its ability to regiment youth and propagate the ethic of achievement. Schools habituate young people to bureaucratic discipline, sorting them through standardized exams, selecting some for careers in the leadership elite and channeling the rest into the labor pool. As historian William Irwin Thompson defines it, the socializing effect of schools, on both faculty and students, is "to teach people how to live in a large public corporation."

Universities meanwhile became expansive educational bureaucracies, exploding into the communities that surrounded them and becoming dominant social, economic and political factors. These centers were linked by a common culture, advanced technical hardware, and a shared affection for information management. In “knowledge societies,” the computer has gradually replaced the museum. After leaving the university archipelago, Thompson noted that what he described as socialized education was indicative of humanity's increasing acquiescence to corporate routines. Big business, big government and big schools, he concluded, are all the same:

"The role of an educational bureaucracy is to educate people to bureaucracy, and this can be done as well in a course in humanities as in one in business administration. If one controls the structure, he can afford to allow a liberal amount of play in the content. The more alien subjects like mysticism, revolution, or sexuality can be brought into the structures of curricular behavior determined by educational management, the more these structures prove their power over just those areas of experience that might subvert them."

Thus, teachers serve as "managers of learning systems" who run committees called classes and prepare students for institutional life. Public schools become wards for the distribution of tranquilizers and behavior modification, or agencies of "manpower selection," part of the control apparatus that includes social welfare offices, mental health centers, and juvenile courts. Before we know it, the muscles of democracy become the closing fingers of the long arm of the state.

Eventually, Thompson walked away from the university world. I say he "walked" to stress that the relationship between teachers and the corporate school is similar to the connection between cars and the corporate tool of highway transportation. While education spreads the ethics of bureaucratic life, transportation systems foster addiction to speed. By now we are well aware of the environmental costs of this addiction. But as the evolution of modern transportation in many less developed countries indicates, speed addiction is also a powerful form of social control.

Before the building of highways, rough trucks in underdeveloped countries traveled dirt roads, carrying people and their animals to market together. After high-speed transport commenced, however, the old ramp trucks that had connected the major centers were exiled to mountains and swamps. No longer could people and animals make the same trip. Meanwhile, the taxes paid by peasants to finance such development flowed into the coffers of specialized monopolies. The average person paid for a loss of mobility without gaining new freedom.

Mass transportation is the popular technological solution to this problem: public ownership of high-speed machines. Yet mass transit cannot re-unite the traveling peasant with his pigs, nor reduce the stratification of society on the basis of speed. Pushcarts and bicycles would be far more effective in this regard. But tools like transportation systems are designed to deliver energy in certain minimum quantities. Less than the minimum cannot be managed, whether the issue is speed of transport, schooling, or medical care. In almost every case, the tools are out of the reach of most people.

According to Ivan Illich, important positions in government and industry are reserved for "certified consumers of high quanta of schooling." And to run the show, they need cars in order to rush to meetings. "Productivity," he concludes, "demands the output of packaged quanta of institutionally defined values, and productive management demands the access of an individual to all these packages at once."

Clearly, productivity today demands both speed and efficiency, which commit managers to quantitative measurement and condemn workers to mechanical regularity. With the introduction of machine tools standardization and interchangability became possible. These led in turn to mass production and the assembly line – the devotion to repetitive order. The modern employee is thus no longer a creative worker; he operates one or more machines. Meanwhile, scientific researchers seek more fully refined routines in the name of public service.

This "rational" approach promotes continued division of operations and tools themselves, as well as increased efficiency in support of the industrial mode of work. Independent art and expressive craft have clearly been among the casualties. Rather than sheltering these vital fields of activity, corporate and government subsidies have professionalized them, resulting in craft elites and art bureaucracies in which administrative rules eclipse aesthetic reflection. More damaging still, art has edged toward conformity, while its purpose has shifted from expression to utility. In the 1960s many craftspeople asked for recognition as bona fide members of the artistic community. Rather than enhancing craft, however, this movement helped to promote a functional ethic among artists.

Government funds in support of "community arts" increasingly had economic and social control motives. The artisans invited to "coordinate" the activities of students, prison inmates and mental patients were also instructed to speed their adjustment to social norms. Although the rebirth of cottage industries had positive environmental consequences, it also spread the values the efficiency and utility. Individual expression began to look somewhat superfluous.

The community artist uses his work to increase the involvement of people in productive activities. However, these activities are defined and regulated by impersonal state and federal structures that too often focus attention on ephemeral issues, meanwhile turning artists into the diffusers of rational values. As a result, art too has become a manipulative tool, simultaneously placing the power of expression into people's hands and turning artists into bureaucrats. Administering creativity in committee-style groups, they allow it to be used as another technique of institutional control. In the end, instead of artists, most of their students become adjusted technicians of paint, fiber, stone and wood, and a potential model of conviviality serves as another mechanism of dependence and exploitation, an opiate that extends the reach of rationalism.

Next: Dictatorship of Time


Previous:

The Creative Also Destroys

Deconstructing Leadership

Anatomy of Insecurity

Managers and Their Tools