How to Increase Your Computer Efficiency With Shortcuts

Everyone knows the scene. The computer hacker, often with a ridiculous time constraint, is tasked with breaking into a virtual environment or stopping others from breaking into theirs. Often typing streams of green text, their fingers caress the keyboard in a seamless blur, putting thought into action and controlling the computer with simultaneous grace and speed. 

This is not reality. Computer programmers and hackers do not break into defended systems in seconds, on the fly; writing sophisticated code takes time and effort.

However, efficient programmers, software engineers, and hackers are still much more adept at manipulating their computers than the average person. Watching them work can be reminiscent of that movie scene: screens shifting here and there, text appearing, being highlighted, and moved around rapidly. You may never type like the movie hacker, but you can still be far more efficient when you use a computer.

Many of the lessons they follow are not exclusive to programming. Rather, anyone who wants to be able to more quickly plant their ideas onto the screen can and should take the time to learn how to become more productive on the computer. 

The following are three tips that I have learned over the past few years to decrease the amount of time I take doing menial tasks on computers and more effectively plant my ideas onto the screen. 

Some of you may already know these tips, but even if you do, it is doubtful you use them to their maximum effectiveness. In line with my previous article on personal productivity, I wanted to provide additional ideas to improve it. I hope they will help you.

  1. Shift your mindset around computer use

The first step to increasing computer efficiency is to shift your mindset around how to use a computer. Your goal should be to reduce the amount of time you spend using your mouse and shift that time towards the keyboard, and to shortcuts specifically. 

When you learn how to accomplish a computer task with a shortcut rather than a mouse, you significantly reduce the input time between a thought you have and an output. 

Usually the steps are as follows: have a thought about what you want to do on the computer, start moving the cursor over to the part of the screen associated with it, finish moving the cursor across the screen (sometimes after overshooting or undershooting the place you wanted it to go), click or double-click, and then repeat from the first step.

If you know a keyboard shortcut, you replace the time spent moving your cursor across the screen with a rapid keyboard input.

While there is a learning curve (and it takes time upfront to learn the shortcuts), the millions of times you might end up using shortcuts will, combined, save countless hours in the long run (8 days per year by one estimate)!
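
To make the arithmetic behind an estimate like that concrete, here is a rough back-of-envelope sketch in Python. Every input (seconds saved per minute of computer use, hours at the computer, workdays per year) is an illustrative assumption rather than a measurement, so swap in your own numbers.

    # Back-of-envelope estimate of annual time saved by keyboard shortcuts.
    # Every input below is an illustrative assumption; adjust to your own habits.
    seconds_saved_per_minute = 2   # assumed: shortcuts beat the mouse by ~2 s per minute of use
    hours_at_computer_per_day = 8  # assumed: time spent at the computer each workday
    work_days_per_year = 240       # assumed: workdays in a year

    seconds_saved_per_day = seconds_saved_per_minute * hours_at_computer_per_day * 60
    hours_saved_per_year = seconds_saved_per_day * work_days_per_year / 3600
    workdays_saved = hours_saved_per_year / 8  # expressed in 8-hour workdays

    print(f"~{hours_saved_per_year:.0f} hours, or ~{workdays_saved:.0f} workdays, saved per year")

With these particular assumptions the sketch lands on roughly 64 hours, or about eight 8-hour workdays, per year, which is in the same ballpark as the figure quoted above.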

  2. Memorize your shortcuts (and prioritize the important ones)

Once you convince yourself of the merits of using your mouse less, the obvious next step is to learn the shortcuts to all the tasks you want to keep on doing. 

To do this, you should prioritize learning shortcuts by their relative importance to your daily tasks on the computer. 

For example, if you spend a lot of time writing text, learn ctrl-left/right arrow and ctrl-shift-left/right arrow (option instead of ctrl on Mac). If you find yourself taking a lot of screenshots, learn windows-shift-s (cmd-shift-4 on Mac). And no matter what you do, you should be using alt-tab and ctrl-c, x, and v.

Below is a list of the most important shortcuts I use. 

*Note the use of the shift key in shortcuts: it either reverses the direction a shortcut would otherwise go (1) or highlights text in a document (2).

E.g. (1): alt-tab vs. alt-shift-tab (cmd-tab / cmd-shift-tab on Mac)

E.g. (2): ctrl-shift-left arrow selects the word to the left, while ctrl-shift-right arrow selects the word to the right (option-shift-left arrow and option-shift-right arrow on Mac)

General Computer Use
  • Undo / Redo: Windows Ctrl-z / Ctrl-y; Mac Cmd-z / Cmd-shift-z
  • Switch applications (cycle to the right / left): Windows Alt-tab / Alt-shift-tab; Mac Cmd-tab / Cmd-shift-tab
  • Find specific text on the page: Windows Ctrl-f; Mac Cmd-f
  • Take a picture of part of your screen: Windows Windows-shift-s; Mac Cmd-shift-4
  • Open the task manager (Force Quit dialog on Mac): Windows Ctrl-shift-escape; Mac Cmd-option-escape

Browser Shortcuts
  • Go to the next / previous field in a form: Windows Tab / Shift-tab; Mac Tab / Shift-tab
  • Switch browser tabs (to the right / left): Windows Ctrl-tab / Ctrl-shift-tab; Mac Ctrl-tab / Ctrl-shift-tab

Document Shortcuts
  • Copy / Cut / Paste: Windows Ctrl-c / Ctrl-x / Ctrl-v; Mac Cmd-c / Cmd-x / Cmd-v
  • Move cursor to previous / next word (while typing): Windows Ctrl-left arrow / Ctrl-right arrow; Mac Option-left arrow / Option-right arrow
  • Select previous / next word: Windows Ctrl-shift-left arrow / Ctrl-shift-right arrow; Mac Option-shift-left arrow / Option-shift-right arrow
  • Move to beginning of previous / next paragraph: Windows Ctrl-up arrow / Ctrl-down arrow; Mac Option-up arrow / Option-down arrow
  • Select previous / next paragraph: Windows Ctrl-shift-up arrow / Ctrl-shift-down arrow; Mac Option-shift-up arrow / Option-shift-down arrow
  3. Practice, practice, practice

Regardless of what shortcuts you memorize, their ultimate usefulness will emerge only when you integrate them seamlessly into your everyday keyboard use. To do this, you must actually practice them. Imagine their use case when you first see and learn them. Then, as quickly as you can, start applying them before you forget. Like a new word, a shortcut can become an integral part of your vocabulary or be forgotten forever; it just depends on how frequently you use it.

So, in the moments after you learn a shortcut, it is crucial that you actually try it out for yourself. You can do some practice here. Even if it initially feels like it's out of your way, I promise it will be worth it!

Are there any I missed? If so, please comment below or send me an email at alexanderpasch@gmail.com.

Notes on Utilitarianism by John Stuart Mill

Some Useful Terms

  • Ethics is the subfield of philosophy concerning the nature of right and wrong. 
  • Normative ethics is the subfield of Ethics concerning what standards to use when judging what we morally ought to do.
  • Consequentialism is a normative ethical theory that judges the rightness or wrongness of actions entirely on their consequences or effects.
  • Utilitarianism is a type of consequentialism which holds that happiness ought to be maximized and unhappiness minimized.

Book Notes

Utilitarianism (1861) is the most famous book on the eponymous ethical theory. Due to its great influence on the study of ethics and its short length of just under 100 pages (allowing for its continual use in undergraduate classrooms), it has maintained great relevance to the present day. It has played a key role in the history of consequentialist ethical theories and can be credited, in part, with their popularity.

Across its five chapters, Mill describes what the theory of Utilitarianism is (and is not), how people might be motivated by it, and his proof of it, and he ends with an analysis of justice and its relationship to the theory. Throughout the first three chapters it is notable how much time Mill spends deflecting canards: objections he considers meritless because they misconstrue what Utilitarianism is, some of which he addresses before he even provides an outline of the theory itself.

This outline begins in chapter 2. The key principle Mill directs us to is the much remarked upon Greatest Happiness Principle: acts are right insofar as they tend to increase overall happiness or decrease unhappiness. By happiness, Mill is eager to point out, he does not mean the trite notion of momentary bliss, but all the aspects of life that are satisfying or pleasurable. Additionally, in discerning what types of happiness are best, he uses a controversial criterion: of any pair of pleasures, if everyone or nearly everyone who has experienced both prefers one over the other, the preferred one is the one bringing greater happiness.

This, Mill believes, demonstrates that so-called higher pleasures of mental, moral, or aesthetic quality are better than lower, sensation-driven pleasures. It also leaves philosophizing and intellectual thinking as some of the greatest pleasures around — quite convenient for Mill, given that this is what he spent much of his time doing outside of political advocacy.

Furthermore, Mill notes, other principles often embedded in moral language, such as veracity or virtue, still have purchase in Utilitarianism. However, these are secondary principles, which, while good guideposts to moral behavior, are not the ultimate deciding factors of right and wrong. The ultimate judge of rightness and wrongness is the degree to which happiness has been increased or decreased.

In chapter 3, Mill dedicates significant time to describing the ways in which Utilitarianism does not differ from most other ethical theories. The same psychological and social sanctions can be used to prompt people to perform moral actions. While it may take time before the tenets of Utilitarianism seep into society through education and persuasion, the mental and social tools to prompt moral behavior are already there, even if what is considered moral is changing.

In chapter 4, we are asked to consider how Utilitarianism might be proved. As Mill notes, this is no direct proof, but it is the best that can be asked of a moral theory. Roughly, it goes:

  1. Everyone desires their own happiness
  2. The only way to prove that something is desirable is to observe that people desire it
  3. Each person's happiness is thus a good to that person
  4. Therefore, the general happiness is a good to the aggregate of all persons

Despite Mill's warning that this is not a direct proof of mathematical strength, it still feels underwhelming. Specifically, it is peculiar that Mill takes it to follow that, because one person's happiness is good for that person, increasing the aggregate amount of happiness is good for the aggregate of people. Such a logical connection requires some further assumptions about what the aggregate of persons means and whether or not something can be good for it.

In chapter 5, Mill considers the topic of justice. He searches for attributes common to conceptions of justice, and finds them to be grounded in a set of emotions dealing with self-preservation, some of which are observed in other animals. These emotions, when constrained by social custom, motivate the creation of law. Mill points out that the etymology of justice demonstrates the deep connection it has to our legal foundations (jus means law in Latin). But, he states, justice is deeper than law, as the law itself can be unjust.

So, justice can be seen as the ways in which society protects our moral rights, sometimes through law. This means we all have a stake in the creation of just systems. Mill connects this to utility, and the happiness principle, by noting that just systems secure people's basic security and alleviate many of the most basic concerns we have regarding harms that others might inflict upon us. However, justice is not systematic; it lies on top of some of our deeper intuitions concerning morality. At the base of our moral intuitions lies the notion of utility, and justice emerges from it. Furthermore, Mill argues, there are cases in which it is moral to act expediently outside of what is just. This demonstrates that justice delineates a class of moral rules which emerge in societies to satisfy certain common emotions concerned with self-protection and fairness, and which is less fundamental than the notion of what is moral, which Mill states is determined by the principle of utility.

Together, the chapters contain many influential and compelling arguments in favor of, at the very least, a prioritization of happiness in any ethical system, if not adherence to Mill's version of Utilitarianism itself. Mill's work has been followed by a series of derivative ethical theories and has done much to advance the expanding moral circle, in which greater moral concern is given to women, the impoverished, those in other countries, and non-human animals.

How I (Try to) Stay Productive: A List of Tips

As I wrote about last week, the internet age has given us countless devices and apps designed to distract. It is still sometimes hard to pinpoint exactly where an activity transforms from useful, or harmlessly entertaining, into full-on distracting. What seems clear to me, however, is that for most people using modern technology, this line is often crossed. Given this, I thought it might be useful to write about what I do to prevent distraction and increase productivity in my life.

I remember when I first started realizing that technology was going to be a serious problem for my academic prospects. In middle school, I often would procrastinate writing papers until the middle of the night before they were due. At this time I did not have a computer of my own, so in some sense I thought it a treat that I was able to use the family computer on a weeknight. I would usually end up watching Netflix until my monkey brain finally ceded control sometime in the wee hours of the morning and I began writing. 

Reflecting upon those experiences at the time, I knew it was a problem. I knew it was going to make it more difficult to succeed come high school, but I didn't have a ready blueprint to deal with it. I was also too confident in my own capacity for self-restraint to seriously ask for help.

Since then, I have gone through numerous strategies to help curtail the negative influence of technology in my life. Today, I rely on a combination of certain habits and certain restrictions on sites. The following are, I believe, the most important features of my current system.

My setup is to keep my computer in the place where I do most of my work. With almost no exceptions, I leave this laptop there and do not bring it to where I sleep and do much of my reading. I do bring my phone with me, but try to keep it apart from where I am sleeping (or at least on the opposite side of the room if I need it for an alarm). I am still working on improving my phone habits, however.

In terms of technical steps to prevent distraction, I have found a few apps, along with some features in iOS 14.3, that are particularly useful. For my computer running Windows 10, I use an app called Cold Turkey to block every website on a list across every browser. It works by forcing you to install the Cold Turkey extension on each of your browsers in order for them to launch. You can then make schedules both for when websites on this list are blocked and for when you are able to edit the list. I have found this to be very effective at preventing me from accessing certain sites (e.g., YouTube, Reddit, and news sites) that I would otherwise gravitate to when bored and get sucked into.

On my phone I have a rather draconian system. The first thing I use is the built-in Content Restrictions feature in the iPhone's settings. There, I only allow access to sites that I have whitelisted: Wikipedia, Google, and a handful of others. I have purposefully forgotten the Content Restrictions passcode, so that I would need to reset it with my Apple ID to change these settings.

This doesn't work by itself, because it doesn't prevent me from installing apps that can easily evade the content restrictions. In response, I have deleted all apps that are even somewhat distracting and purposefully forgotten my Apple ID password so that it is more difficult to reinstall them. Hopefully, in the future, Apple will make it easier to self-regulate your usage and harder to bypass your own restrictions. I know my complicated setup isn't for everyone, but you can still use the productivity features offered in iOS in less extreme forms.

Here is a summary of my productivity tips:

Habits and Home Setup Tips

  • Separate your workspace from your sleeping and resting space.
  • Keep your computer in a different room from where you sleep.
  • Charge your phone in a different room (or at least the opposite side of the room) from where you sleep and where you work.
  • When reading, keep devices in a different room or put them where you can’t hear them.
  • Set times in your day when you can't use the internet, particularly at night.

Tech Tips

  • Use Cold Turkey (or SelfControl for Mac) to block sites or apps that you find distracting on your computer. Or, only allow yourself to use certain apps at specific times.
  • Use Content Restrictions on iOS to either block all non-whitelisted sites, or block specific sites you find distracting.
  • Turn off notifications from apps that you use too much.
  • Delete apps that you can't stop using or that keep distracting you.
  • If you need a draconian measure, forget your passcodes and passwords that allow you to change these settings or download distracting apps.

Most importantly, I’ve found that this is a continuous process. You will not find the perfect setup for yourself immediately. The most important thing is that you don’t give up. Instead, accept incremental progress as you learn more about yourself and your habits.

Good Luck,

Alexander Pasch

Bottlenecks to Progress in the Internet Age

I have been reading A New History of Western Philosophy by Anthony Kenny, and it has resurfaced thoughts I have often had when learning about historical figures and everyday life in prior eras. In particular, how these figures were able to overcome the dual problems of censorship by political and religious elites and the limited availability of information will always fascinate me.

The lack of access to crucial historical texts was perhaps the major bottleneck preventing philosophical progress in medieval Europe. In fact, the capture of Constantinople by Ottoman forces in 1453 ended up being critical for the Renaissance: it forced the Greek scholars who had kept the philosophy of Plato and other ancients alive to flee to Italy, where Scholasticism (the rigid fusion of Christianity and Aristotelianism) dominated. The spark of the recovered classics was enough to light the flames of new philosophies that burned the Scholastic tradition to the ground.

Think about that. Works of Plato, lingering somewhere in Byzantine libraries for hundreds of years, simply needed to be transported across the Mediterranean and communicated by the scholars who kept them to unleash a wave of progress the world is still reverberating from. Obviously there were many factors behind the Renaissance, but it is a remarkable feature of the time that a relatively small set of books could cause such massive intellectual changes. In part, this is because there simply wasn't that much new material to read. Something coming out was a big deal, even if it was a re-release. In fact, it wasn't really until the 19th century that it became impossible to read everything worth reading in most subjects.

Beyond the scarcity of written material, religious and political persecution has been another persistent obstacle to progress in the Western philosophical tradition. The political turmoil in the lives of almost every major medieval and early modern philosopher is striking. Each writer had to self-censor, and many were forced to flee or were outright killed. To name a handful:

Boethius (tortured and killed by the Ostrogothic King Theodoric)

Giordano Bruno (denounced and burned at the stake in Rome)

Baruch Spinoza (excommunicated and exiled from the Jewish community in the Netherlands)

John Locke (fled England to the Netherlands to avoid political persecution before returning)

In stark contrast to this is the extraordinary availability of information today and the ease with which new ideas can be articulated. This is perhaps the most remarkable fact about our era (and what makes your reading this possible at all). It also raises the question of why, since the invention and wide-scale adoption of the internet, productivity and economic growth haven't sped up more. One theory (articulated by Tyler Cowen) is that we have already picked much of the low-hanging fruit that yielded the massive economic progress of the 1900s. Science, likewise, is using more people to make less progress than it did in the past.

If this is true, then it seems that we hit a sweet spot for GDP growth and scientific progress somewhere in the 20th Century. Our intellectual and political climates were just good enough to unleash discoveries and inventions just out of reach of previous generations, but much easier to find than those to follow.

On the personal side, it might be hard to relate to GDP figures. But the relationship between personal productivity and economic productivity is a topic that still sometimes crosses my mind (despite how differently they may be defined). For myself, having been born in an age and place where the internet was nearly ubiquitous, and my capacity for distraction by it nearly endless, I wonder what its overall effect on our productivity has been.

On the one hand, learning has become unquestionably easier. Writing papers often consists of cycles of typing, opening a new tab, searching Google, finding crucial information, and switching back to type my findings and analysis mere seconds later. This would have taken orders of magnitude longer in the pre-internet age but is now a seamless feature of students' and writers' lives. Educational content producers and random helpful figures on the internet are easily found and often filtered by how useful their information is. Finally, Wikipedia (which yesterday turned 20!) is always there to provide an overview of just about anything.

But that is helpful only when I am working. The expression I have found most apt in describing my personal productive capacity is Parkinson's law: "work expands so as to fill the time available for its completion". The shorter the deadline, the more productive I am in meeting it; a longer deadline gives me time to slack off and fuels procrastination. And while procrastination has existed since the day man began working, the magnitude of its influence has grown larger than ever before.

The attractiveness of distractions has grown, particularly as our attention has been commodified, with a profit motive attached to our eyeballs. Devices and applications are extremely efficient, not at improving your overall well-being, but at guiding your attention in whatever way software engineering teams see fit. This is a uniquely modern curse.

To bring this back full circle, I must clarify that I would unquestionably submit to the current challenges of slowing growth and hyper-distraction rather than those of intellectual scarcity and persecution. We have traded away the incredibly cruel world of the past for good reason. 

However, we must think harder about the questions posed by the information age. How should one deal with the experience of information overload and the increasing complexity of decisions (particularly major life decisions)? How should we design our relationship with our technology to leave us well informed, more in control, and less distracted? How should we think about the economy and our role in it — particularly if much of the low-hanging fruit has been plucked, and humans (with the same brains and bodies) are demanded to jump higher than before in order to achieve the same GDP growth achieved in the past?

The curses of the past have been traded away for lesser, and in some ways opposite, curses of the present. Acknowledging them and answering the questions they raise is something I will continue to attempt. Luckily, the internet has shown me that I am not alone.

Consciousness: What it is and why it matters

This is part one in a series on consciousness

I’ve desired for some time to begin writing about my view on philosophical topics in an approachable but serious manner. With the advent of a new year, I figured I would now begin publishing weekly posts in this vein, starting with a series of posts on consciousness. 

By consciousness, I mean something quite basic: the fact of experience, or what it is like to be something (in Thomas Nagel's sense). I do not mean self-awareness, or the capacity to reflect, to report, or to remember. Non-consciousness, on the other hand, is simply the absence of consciousness. I do not use the word unconsciousness here because it often refers to parts of the human mind that lie out of reach of consciousness, and I am talking about more than just us.

My rationale for writing about consciousness is two-fold. First, consciousness is in many ways foundational to everything we care about, especially in ethics, another topic of great interest to me. Understanding how widespread consciousness is matters for developing our moral frameworks (lest we vivisect dogs again because we believe they are soulless) and our general theories of the world. Second, I believe the current dominance of a physicalism (the belief that physical matter is the only fundamental substance that exists) that treats non-conscious things as the default is misguided. I find that this version of physicalism rests on shaky premises, which I wish to investigate and challenge.

That challenge is what I will begin in this post. Positing that something is non-conscious is, at bottom, an odd endeavor: it involves using your own conscious states to try to demarcate what, outside your own mind, is not conscious.

When you imagine a rock, or another entity you believe to be non-conscious, you are using your consciousness to imagine or sense it. You cannot go a further step to imagine its non-consciousness, for imagining entails consciousness of some sort. Instead, what appears to occur is that you are unable to utilize your theory of mind on such an object, and perhaps with the assistance of other beliefs (consciousness requires a brain or information processing or movement) you then have the thought: “this rock is not conscious”. 

However, when you drill down on these thoughts, they become difficult to justify. The capacity to use your theory of mind does not determine whether any given person, animal, or object is conscious. We can imagine what a dead person would be thinking, while failing to imagine what a bat is feeling. Furthermore, investigating other beliefs about what is and isn't conscious often relies on the premise: things that are not sufficiently like me are not conscious.

This is what I shall write about in next week’s post.

-Alexander Pasch

Consciousness: Where it might not be

This is part two in a series on consciousness

Continuing from last week's post, I shall explore how exactly one can doubt the consciousness of the objects one encounters. Again, by consciousness I mean any type of experience something or someone might have, or what it is like to be something.

From the birthplace of modern philosophy, we have irrefutable reasons (from Descartes) to say that conscious stuff exists, at least within whoever is thinking the sentence 'I think, therefore I am'. Beyond the odd solipsist, nearly everyone agrees that it is also reasonable to assume that other people are conscious. Today, we further assume that dogs and other mammals are conscious. What about trees? Grass? Rocks? The Sun?

For most of my life, the answer to these sorts of questions seemed obvious. While the exact nature of consciousness remains mysterious, it was clear to me that it was a product of the brain. The mind is what the brain does, to use a neuroscientific quip. Consciousness is something like information being processed, or a byproduct of a working functional system.

Yet I began to doubt these answers as I considered the unity of nature: the fact that all things, including our bodies, are made of the same particles as stars, emergent from the same quantum fields. The trajectory of history also seemed to point in the direction of decreasing human distinctiveness (from Copernicus to Darwin to Goodall to AlphaGo), an expanding circle of moral worthiness, and a wider range of animals considered conscious.

So I investigated the actual premises, the underlying reasons, for a belief in non-consciousness. A starting point is the observation that human consciousness is profoundly altered by changes in the brain. This was noticed as far back as the Roman physician Galen, who wrote about how gladiators who suffered head injuries were permanently psychologically harmed. It presaged the connections between brain activity and conscious states that modern neuroscience has done so much to uncover.

From here, it could be assumed that the requirements for consciousness are found in certain properties the brain has (whether as an information processor or for the functional roles it plays in living organisms). After all, if the brain is harmed or sedated, you lose consciousness (or at least the memory of it). Every theory of consciousness therefore gets selected first by whether it explains the consciousness of those who can say that they are conscious. Right now, that's just us humans.

But the problem is that you can't get a restrictive theory of consciousness off the ground without an additional assumption: that anything that isn't sufficiently similar to us humans isn't conscious at all. Otherwise, there is no way to disprove countervailing theories of consciousness that describe non-human objects as conscious.

If you want to say consciousness emerges when brains, or similarly complex objects, are formed, I can come along and say, "yes, that is one example of consciousness, but consciousness also occurs when only relatively simple objects are present." You have to fall back on an intuition that things that are not similar enough to us are not conscious. No matter what restrictive theory you have to explain consciousness, there is no way to refute a wider theory of consciousness without that intuition.

The following argument articulates how this line of reasoning works:

1. I am conscious. 

2. I can sense many things that are not similar to me (or the body I consider mine).

3. Things that aren’t (sufficiently) similar to me are non-conscious.

4. Therefore there are many things that are non-conscious. 

For the argument to be sound, it relies on the intuition in premise 3 (as well as a vague notion of similarity). Yet every theory that restricts consciousness to some subset of things similar to us relies on it. Where this intuition arises from is of interest to me.

In next week’s post, I will investigate how this intuition might itself be emergent from the physicalist worldview, creating a circular argument.

Consciousness: Why people think it might not be everywhere

This is part three in a series on consciousness

Last week, I introduced the intuition that things "that are not (sufficiently) similar to us are not conscious." This intuition matters because, without it, there is no way to ground a restrictive theory of consciousness. Put another way, without this intuition, you would find it impossible to defend the position that anything at all is non-conscious. It is present, whether explicitly or implicitly, in every restrictive explanation someone gives for why consciousness is or isn't present somewhere.

One could argue that the intuition can be avoided by instead falling back on some other defining feature of consciousness. For example, if you believe processing information is necessary for consciousness to exist, you might think the phenomenology grounding this belief (consciousness simply is information processing) justifies it, and thus justifies restricting consciousness to information processors. Ostensibly, falling back on this belief could remove the need to rely on the intuition described above. However, this falls apart when you look at the details.

For one, there is no single definition of information processing. It could be that everything in the universe is describable as an information processor (perhaps in the way a particle or an object enacts the laws of physics in order to interact with surrounding objects). But then this ends up being an entirely non-restrictive theory.

To counter this, the definition of information processing could be made more restrictive. However, for any restrictive definition of information processing, the phenomenological grounding breaks down. It is possible for me to see how my consciousness might be, in some loose sense, information being processed. But it is very unclear, phenomenologically, why any one restrictive definition of information processing is the correct one. Then, without phenomenology to explain the choice of any specific restrictive definition of information processing, one would again have to fall back on the intuition that things that aren't sufficiently similar to us humans are non-conscious (since that type of information processing would happen to occur in our brains but not everywhere).

This brings us to the question: where does this intuition come from? Why believe that anything we experience is non-conscious? I believe it is a consequence of our current physicalist worldview. If the things in our environment move like clockwork, as physics tells us, they can be predicted without any mention of consciousness. In that case, the fact that we are conscious at all is something special that needs to be explained. The explanation usually ends up being a restrictive theory of consciousness (call it X). Because most things in the universe aren't like you, you can then use this theory to explain why those things are, in fact, non-conscious. This, in turn, can be used to justify the version of physicalism you began with (the one which explains the world without reference to consciousness). That, however, creates a circular chain of justification.

In next week’s post, I will conclude my thoughts on consciousness by addressing some critiques and discussing why I think this topic is relevant in the first place. 

Consciousness: The relationship with the current physicalist worldview

This is part four in a series on consciousness

Last week, I discussed how one might justify a restrictive theory of consciousness (that is, any theory which says consciousness is not universal). I concluded that even if you try to ground your restrictive theory in your own phenomenology (your first-hand experience), you still cannot do so without holding the intuition that things that aren't similar to you aren't conscious. I shall call this the "similarity intuition," or simply "the intuition," in this post.

Put in argument form, here is a way you might try to avoid relying on the intuition.

  1. Consciousness requires X
  2. X doesn’t occur in things not similar to me
  3. Therefore, things that aren’t similar to me aren’t conscious

Now you rely on (1) instead of the intuition. But you still need a reason to believe that X is required. This could be established phenomenologically.

  1. My consciousness has certain essential properties that I can discover phenomenologically
  2. These properties are essential to any other consciousness
  3. These conscious properties can be mapped on to certain properties X, which are present in certain physical systems 
  4. Therefore, if X isn’t present in something, it is non-conscious

This argument appears to sidestep the intuition, but it relies on it nonetheless. First, in (2), it assumes that the properties essential to your consciousness are present in any consciousness. In other words, all consciousness must be similar to your consciousness, at least insofar as it has certain properties.

The similarity intuition is more clearly present in premise (3). Any restrictive mapping of phenomenological property to a physical or mathematical system will require an intrinsically self-centered approach. This is because it consists of humans mapping their experience to their brain states. In order to justify this mapping, one has to rely on the intuition that other less restrictive mappings don’t describe consciousness. In other words, things not sufficiently similar to me (where phenomenological states are mapped to a physical system dissimilar to me) are not conscious.

One could argue more easily against the second main claim I introduced in last week’s post. Here, I linked the current physicalist worldview to this similarity intuition in a circular, self-justifying relationship. One could argue that physicalism is compatible with panpsychism, an expansive view of consciousness that sometimes describes consciousness as a physical property common to all particles or physical systems.

Moreover, some might claim that physicalism needn’t weigh in on the debate over exactly where consciousness exists at all. Simply put, the more dissimilar a physical system is from a human being, the less we know about whether it is non-conscious or conscious. 

If this were all that people claimed, I would have less of a problem. But most physicalists do not merely argue that there is no epistemic justification for believing that things dissimilar to us are or are not conscious. They do not sit in a state of agnosticism about this. They believe that such things (e.g., rocks, plants, waterfalls) are, in fact, non-conscious.

My claim is that there is an obvious connection between the common scientific-physicalist worldview, conceptualizing the world as clockwork, and the belief that most of the world is non-conscious. Furthermore, the similarity intuition is both justified by this worldview, and helps maintain it.

I want to say here that science continues to be the best way we have for explaining much of the world. In countless ways it has made our lives easier to live. But it is also true that the questions scientists are asking do not try to answer what I am talking about. They usually ignore consciousness, and for good reason. Treating things in the world as clockwork puts us in a frame of mind to start making hypotheses, mapping out relations between cause and effect, and making predictions. This is an eminently useful endeavor. 

But success in treating objects in the world as clockwork should not permanently cloud our judgements about whether, at the ground level, everything in the universe actually is determined. And it certainly should not prompt us to permanently believe that consciousness is present only in systems similar to us; at least not without proper justification.

My attempt in this series on (non-)consciousness was to push back against a common dogma and identify a common intuition justifying physicalism. I don’t know how many readers I have convinced of this, but I hope to have at least pushed the conversation forward a little bit.

Best,

Alexander Pasch