
Wednesday, May 17, 2023

Disaggregation dilemma - Part 1...(Of GIS based PDFs and Water Soaked Blue-prints)


What is wrong with our land-use plans ?

Well...nothing. Except, perhaps, the fact that they belong to an earlier epoch of technological development - a period when one necessarily had to prepare maps at different spatial scales in order to show greater or lesser detail; and use specific colours to aggregate the primary land uses at different scales - for example, yellow for residential use; red for commercial use (depending on prevalent cartographic rules).

One can also say that the technology of land-use maps, as they continue to be used in urban planning in India, corresponds to the period of map-making prior to the advent of computerised cartography and geo-spatial analysis.

Using present technology, we do not need to switch between different maps prepared at different scales to study different degrees of spatial detail. Instead, we can simply zoom in and out within the same map. 

In most aspects of our lives we take this for granted - when we are booking an Uber; or checking directions to a destination on Google Maps; or checking how far away the Swiggy delivery partner is at a particular point of time. 

In all such businesses, computerised geo-spatial analysis and decision-making is not just one of the components to be considered -  it is the most fundamental science and technology on which the business operations play out.

However, in a vital and complex activity such as urban planning, whose social and economic significance far exceeds that of profit maximisation in the gig economy, such technology is still a sort of novelty - far from having been internalised by the rank and file of the profession.  

In fact, the inadequacy of technical knowledge becomes amply clear precisely when one takes a look at the manner in which the planning profession attempts to internalise geo-spatial technologies. I discussed this in an earlier blog.

It is perhaps too difficult for our planning professionals and educators - too busy flaunting tech-terms and buzzwords - to come to terms with the simple fact that if your planning maps are made using GIS software, then you do not need separate sets of maps at the city, zone and layout levels -- they are all part of the same geo-spatial database ! 

I am not even getting into the travesty of making such "GIS" maps available online in PDF format and then providing the attribute data in separate spreadsheet files and THEN announcing this pointless hotch-potch as Open-data ! A tighter slap on the face of the open-data movement was never landed. This is not open-data...this is an open disdain of the citizen.

 

From GIS based PDFs to Water soaked Blueprints

Let's have a look at such maps as they are available from the website of the Delhi Development Authority -

a) Here is the "big honcho" - the proposed land-use map of all Delhi. The highest level of the plan and the one with the smallest geographical scale and level of detail. Most of the time lay-persons attempting an analysis of the Delhi Master Plan remain pre-occupied with this level. Of course, it shows nothing more than the most general and most aggregated land-use distribution at the level of the city.

b) The next level of planning detail comes in the form of zone-level land-use maps. Shown below is the map of Zone-F in South Delhi. As per the zonal plan report, already in 2001 this zone had an area of 11,958 hectares (i.e. about 119.6 square kilometres) and a population of 12,78,000. That basically means that while it is just a part of the city of Delhi, it is still larger than many smaller cities of India (it is, in fact, larger than the smart city of Bhubaneswar in terms of population). 

The Zone too, therefore, is at a substantially high level of aggregation and can be compared to city level land use plans of one-million plus cities in India.

(NOTE - pay attention to the key-map in the attachment below and marvel at the cartographic genius of whoever prepared this "GIS based" pdf output)



c) And something peculiar happens when we go down to the level that contains the maximum geographical detail - the layout plans, which are more like plans for a cluster of neighbourhood blocks. 

Here is what the plan of one of the layouts constituting Zone-F looks like...if you can make anything out that is. The keen observer would realise that this is actually a well drafted layout map (at least the key map is correct !), but we have suddenly descended from the world of GIS based PDF map outputs, to the world of water-soaked and worn-out archives of crumpled gateway sheets and blueprints. 



This is what gets uploaded as digital layout maps on the website of the premier urban planning agency of the capital of the country. 

There is therefore a complete dissonance between what digital and geo-spatial technologies truly are and how they are being utilised. 

In this matter the critics and activists of the civil-society and consultants of the private sector are often more technically incompetent than government planners. The government officials may not be familiar with the modern software but they know their cartography well enough (as illustrated by the water-soaked map), while civil society critics and private sector consultants (who often actually prepare the "GIS" outputs) are often poor in both technology and cartography.

 

In the next part we will see how computerised geo-spatial methods eliminate the problems of aggregation by allowing data to be maintained at as disaggregated a level as its granularity allows, and then aggregating the base-data to whatever level is required using the processing power of the computer.

In the words of planning expert and theorist Poulicos Prastacos -

"Data should be maintained at the lowest level of disaggregation and then readily aggregated as the need arises."

(Source - 'Integrating GIS technology in urban transportation planning and modeling' - P. Prastacos)
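Prastacos's principle can be illustrated even with the plainest command-line tools. Here is a minimal sketch - the file name `plots.csv`, the zones and the values are all invented for the demo. The base data stays fully disaggregated at the plot level, and awk aggregates it on demand:

```shell
# Hypothetical disaggregated base data: one row per plot
# in the form zone,land_use,area_ha (values are invented).
cat > plots.csv <<'EOF'
F,residential,12.5
F,residential,7.5
F,commercial,3.0
G,residential,4.0
EOF

# Aggregate on demand: total area per zone and land use.
# The base file is never modified; only the view of it changes.
awk -F, '{ total[$1","$2] += $3 }
         END { for (k in total) print k "," total[k] }' plots.csv | sort
```

The same base file can just as easily be rolled up to the zone level alone (index the array by `$1` only) or left untouched at the plot level - exactly the flexibility that a static PDF map gives away.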

To be continued...



Saturday, May 6, 2023

The Data exists...right under our Mouses !

The capital irony

It is perhaps a capital irony of our times that precisely when computers are more powerful and affordable than ever before, and access to powerful and previously expensive software is provided by the Linux + FOSS (Free and Open Source Software) movement, the general ability to use computers effectively to address the various problems faced by our cities is at an all-time low.

I myself come from a background of primarily qualitative and participatory techniques in urban planning. I continue to have a natural fondness for such techniques, but have increasingly also discovered the power that effective use of computers brings to my work.

Contrary to the myth that the quantitative and qualitative worlds are poles apart (which leads to the further myth that professionals dealing with qualitative techniques cannot use computers for serious quantitative analysis), the two are in fact friends and allies of each other and help each other continuously.

Without getting too complex, think of a rather simple example. I would like to undertake participatory exercises in various slums in my city, and I use all kinds of creative ideas to carry them out inside those communities.

But alongside that, I could also prepare a GIS database of the slums in the city that gives me spatial and quantitative information on slums - such as their location, distance from each other, distance from other city facilities, size and density of the settlements, the population and occupational characteristics of the settlements etc.

This quantitative database can actually help me increase the effectiveness of my qualitative techniques by helping me to schedule meetings, use different techniques in slums of different sizes and shapes, check the probability of consensus-building (fewer meetings could build consensus faster in a smaller slum than in a larger and denser slum) etc.

Rather than focusing too much on whether to deploy quantitative or qualitative methods, it is better to focus on the problem that needs to be solved and deploy whatever methods that may be necessary.

Why computers ?

As long as I want to do participatory activities in a handful of slums, I may not need the support of computers at all. However, if I would like to undertake such activities in tens or hundreds or thousands of slums, then I begin to feel the need for the processing power of the computer.

It is as simple as that.

As more and more resources are made available to various urban development programs and schemes in India, their sizes, duration and scale of operation are all increasing. It is not difficult to understand that in a country the size of India, urban development projects would need to be undertaken at a scale where one can at least hope to make a meaningful difference. 

But of course, computers need instructions to follow - and they need data to work on.

The data exists...right under our mouses

Quite often, the impossibility of obtaining data is cited as one of the main barriers to effective use of computers in solving urban problems in India. I have written on this topic on multiple occasions. And I have stressed in earlier blogs that mere accumulation of digital data is not of much use if one does not know how to use computers effectively to process it.

However, another capital irony of our times is that much of the data whose absence we so lament - does indeed exist...and sometimes right under our noses (or mouses).

Let me demonstrate.

This particular link will take you to the dashboard of the "GIS based Master Plan" sub-scheme of AMRUT. 

The very first component of this sub-scheme was geo-database creation and in the following screen-shot of the dashboard we can see its status -

 



If we look at the first three steps of the component, we can see that satellite data had been acquired and processed for about 450 cities. The pie chart on administrative works is not self-explanatory, but it could mean that the National Remote Sensing Centre (NRSC) of the Indian Space Research Organisation (ISRO) may have handled the satellite data acquisition and processing for 240 cities, while private companies may have done it for another 220 cities.

In any case, according to the official dashboard itself, we can conclude that processed satellite data exist for about 450 cities. As per the status chart, final GIS maps also seem to exist for 351 cities.


From if it exists...to where it exists

Finding evidence and clear arguments for the claim that something exists is the first step in finding it. If I know for sure that something exists, then I need not succumb to the fallacy that it doesn't exist at all. 

The task after that is to discover, where it exists, rather than wonder if it exists.

The same method can be applied to understand exactly what all data has been collected and processed under the myriad central and state government schemes that are going on in the country and have already been executed in the past.

Believe me, we will have more data than we would need for getting most of our tasks done.

The catch here is this...a person who does not have the imagination to discover the data, most likely would not have the imagination to use that data either.

But let's keep that blast for a later post ;)

Monday, February 20, 2023

Regarding farcical flirting...and GIS based master plans

Earlier today, I wrote the following on my linkedin status -

GIS based Master Plans...it's a bit like saying pen based novels...or camera based photographs.
India's farcical flirting with technical terms has to stop.
It shows an unreflected acceptance of meaningless sentences - in other words, it helps accelerate collective stupidity.

I felt that this should be elaborated upon. 

To be sure, I am not against flirting - it is a creative art. Anyone who wishes to study it seriously could turn to Act 5, Scene 2 of William Shakespeare's play "Henry V". 

Here is a youtube link to the scene in the classic film adaptation by Sir Laurence Olivier -

 

Of course, one is free to point out that strictly speaking Henry was wooing Lady Katherine and not flirting with her. 

But then I am equally free to reply that while Henry was indeed wooing Katherine...he was also flirting - not with her - but with the just-altered geo-political situation in Europe following the battle of Agincourt, where Henry achieved a decisive victory over the French and was in a position to dictate terms to them. 

Yes, flirting is art indeed - of geo-political scale and significance.

So much for my admiration for genuine (geo-political or not) flirting. 

But the recent tendency (increasing at an exponential rate) in India to flirt with "tech" terminology (including the ridiculous-sounding word "tech") without any regard for what it really means can be considered farcical indeed.

Sentences beginning with the following should immediately put one's BS-filtering systems on high alert -

  • "We are developing digital tools for..."
  • "According to our AI based tools..."
  • "As per our machine learning algorithms..."
  • "We use high-resolution satellite imagery for..."
  • "We have adopted a data-driven approach for...."

The trouble is not with terms like digital, AI, machine learning, high-resolution satellite imagery etc., but with the things that generally appear in the second part of the sentence.

Consider the following statement - "We use high-resolution satellite imagery to track land-use violations in Bhubaneswar smart-city on a monthly basis."

If this video clip (with an animation showing a satellite "diving" from its orbit every time it tries to get a better look at Bhubaneswar and other such wonderful things) is not enough to turn you numb, consider the following questions -

  • How exactly does high-resolution satellite imagery help me understand what use the buildings it shows are put to - commercial, residential etc. ?
  • How does it help me understand violations in bye-laws such as height restrictions ?
  • How exactly does it help me understand the prescribed use of the land on which the building is situated ?
 

 AND...

  • What about the fact that, last I checked, the master plan of Bhubaneswar was under revision and the new one is not even out in public yet ??

High-resolution satellite imagery is indeed extremely useful for city planning purposes, but let's just say -
there is many a slip 'twixt the cup and the lip.
 
In other words, there is a whole range of activities that have to be done, and done properly, before high-resolution satellite imagery can perform the task of tracking and preventing land-use violations.

Beginning the sentence with the 'cup' and ending it conveniently with the 'lip', without ever clearly mentioning all that lies in between, betrays either complete ignorance of the processes involved or a sly ploy to shock-and-awe anyone not familiar with such technology for the sake of furthering one's agenda.

As regarding all that lies in between the cup (the technology) and the lip (its successful application), we can turn to Shakespeare again and say,

"Ay, there is the rub !"

It is precisely in all the activities that need to be done to make a technology effective, that the real professional grind lies - and that which most would like to steer clear of.

It is easy to write software...but notoriously hard to organise and clean the data that the software would use; and to fix the organisations that would use the software.

 

And now let's turn to the star of the show...one that really cracks me up every time I hear it - 

GIS based Master Planning.

I wonder who came up with that one, and most importantly - why ?

Do we ever say ludicrous things such as a pen-based novel; or a camera-based photograph; or a typewriter-based article, etc. ?

What exactly is the point of defining a planning process using one of the many tools, which may be deployed to aid its preparation, apart from either or both of the following -

    a) Zero understanding of planning.

    b) Zero understanding of the role of GIS in planning.

Well, there is a set of more realistic causes which are far more sinister than the above two - but let's go with these for the moment.

Applying the GIGO (garbage-in-garbage-out) model - which suggests that if the input (data) is garbage, then the output (solution) would be garbage too - to our present topic, we can argue that if the input (the planning approach) is meaningless, then the output (plans produced), would be meaningless too.

Thankfully, we have an elaborate dash-board available in the public domain to support our argument. 

As expected, the dash-board is a cool and elaborate one containing all kinds of information, except the most important one - the plans themselves. 

The first alarm bell rings when we scroll to the middle of the page and look at the master plan formulation status. We learn, that after 8 years of implementation, only 135 out of the total 500 cities covered by the scheme have reached the final step of "Final Master Plan". A total of 257 have reached the level of "Draft Master Plan".

But where are these amazing 135 GIS based Master Plans ?

For that one has to scroll right to the bottom of the page and click on the relevant states (or cities) on a map to access the plans. One has to repeat a few more rounds of needless clicking until one finally reaches the page from which two separate files can be downloaded - Master Plan Report and Land Use Map.

As Gujarat was proudly coloured green (all tasks completed for all cities), I decided to click on a random city called Botad.

I didn't expect to hit the jackpot on the first city I clicked on, but this is the Master Plan Report that popped up -

 


 
You can download the report directly from this link.

It was a pdf copy of a 30-page report in the Gujarati language. And here is a screenshot and a direct link to the land use map that accompanied it -


The map too is in pdf format; a jumble of different layers overlaid on top of each other; with no clear distinction between existing and proposed land-use. 

The map is a cartographic nightmare but that's the least of our worries.

Forget about using available technology to develop a planning approach based on decades of theoretical and practical advances and reflections in the field of planning - instead, we are offered a national-level scheme which purports to use GIS but provides us with pdf files that tell us nothing and which we can use for nothing.

The reports and plans of different cities seem to be prepared by different private consultants, each following their own cartographic rules; structure of report; and occupying their own unique positions on the scale of being slapdash.

The only thing that we can be certain of regarding the meaning of the term "GIS based Master Planning" is that all consultants have made some use of GIS software in preparing the maps.

Pardon my French, but what the F*** is the use of that ?

And yet, why is it that we don't burst out laughing and brush them aside the moment we are presented with these ludicrous "tech"-loaded planning terms, be it GIS based planning or the galaxy of terms following that other ubiquitous and incomprehensible adjective - Smart ?

What makes us take such tragic farces seriously....discuss them, develop projects around them, organise conferences and webinars on them ?

Maybe because we are distracted from the real tasks that face us...and alienated from the scientific knowledge that we need to tackle the urban challenges facing us.

This nonsense seems to be everywhere...and relentless.

But it seems to contain the petrified brittleness of all things insecure...one tiny push and it may crumble to dust.

Then why not give it a resounding whack ?

Wednesday, January 25, 2023

(Smart when you create...dumb when you consume) <-- How to recognise unnecessary technology and some Jaga Mission Stories


The challenge of humanity, since the industrial revolution, has not been one of scarcity, but one of excess (and of the exploitation and inequality that naturally appear when that excess - which can provide for the whole world - ends up being controlled by the few). 

The trouble with technology is the same. How much is enough ? Where does one draw the line between the need and the want...the useful and the useless ?

Perhaps, there is a simple way to distinguish the technology that one needs from the technology that is unnecessary, superfluous and, in all probability, harmful.

Any technology which makes you smarter while you use it and keeps making you smarter and more creative the more you use it, is a healthy and useful technology for you. The technology that makes you dumb and dependent when you use it, is neither very useful nor very healthy in the long run, irrespective of the convenience that it brings.

Most of the time, we use (rather consume) technology that makes the creator of that technology - and those who control the creators - smarter and more powerful, while making us dependent and constantly distracted (which cannot but cause a steady dumbing down over time).

Technology ...Consumer and Creator

A casual glance at the way people use their computers - one of the most powerful tools of modern technology - can confirm the above statements.

Google-maps may make map-reading, navigation and orientation very easy, but it can also decrease our ability to use our sense of direction, powers of observation and of memory to remember and locate landmarks, judge distances etc.

Typically, when we use google-maps our eyes stay glued to the smart-phone screen (the effect is the same even if we are looking at the road while driving...the app tells us everything), but if we were to go old-style with paper maps we would have to constantly look up from the map to scan the surroundings and ensure that we are at the right spot.

Of course, I am absolutely not suggesting dumping google-maps and returning to paper maps. After all the whole purpose of technology is to make life more convenient for humans so that they don’t have to engage in ceaseless manual labour and mundane tasks and engage in more meaningful pursuits instead.

But if the consequence of such “convenience” is a steady process of dumbing down and engaging in such "meaningful pursuits" like spending hours on social media and the "struggles" of becoming an influencer, then it might actually be healthier and more meaningful to return to a life of heavy manual labour (if one can, that is).

The idea is not to turn one's back on technology (it simply cannot be done), but to be aware of the manner in which technology is owned, controlled and provided to us, so that we can be conscious of its effects on us.

GPS and the Kargil Lesson

Talking about the conveniences of positioning systems, that is exactly the lesson that the Indian army learnt the hard way during the Kargil war of 1999. India was denied access to the Global Positioning System (GPS) by the USA exactly when she needed it most, in the context of high-altitude mountain warfare. The Kargil war was also a trigger event that led to the development of NavIC ('Navigation with Indian Constellation'...the word 'navik' means sailor in Hindi), India's alternative to GPS. 

Therefore, the simple learning that the above case provides, is that it is alright to be a user of technology, as long as you also play a role in developing it (or at least understanding how it operates), but it can be downright dangerous if you forever remain merely a consumer of technology.

Still, no matter how dependent google-maps may make you, it is a very useful tool. What arguments could one possibly offer to justify the helpless addictions caused by the largely useless social-media platforms ?

I am sure the people who develop these platforms continue to sharpen and develop their skills in programming and problem solving, whereas the users continuously lose the ability to use the computer for the main task it was created to perform – to compute. On top of that, the more they use...the more they generate data for these very same companies.

One cannot wait for society to change to protect oneself from such devastating trends...one simply has to jump off this crazy train oneself.

Linux and Synaptic Connections

For me that jump was in April 2016, when I made the switch from Windows to Linux...and never looked back.

I realised that the users of open-source operating systems and software, inevitably, start transforming into developers with time.

Or as my dear friend Titusz Bugya, who introduced me to Linux and taught me pretty much everything that I know about the proper use of computers, once put it jokingly-

"Linux IS user-friendly !! It is a friend of the User...not of the idiot !"

As the Linux beginner starts overcoming the hesitation and fear of the terminal window and has the first conversations with the computer using the command line, s/he begins to hone that most essential and fundamental skill required for solving a problem, no matter how complex -- the ability to formulate a question.

The clearly formulated question leads to the precisely formulated command and that leads to the desired result.

Here, the Unix philosophy - also followed by Linux - of using programs that do only one thing and do it very well becomes a great tool. It encourages you to break down complex tasks into component parts - which by themselves may not be as overwhelming - and then deploy the appropriate programs to tackle them one at a time. 
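As a small illustration of that philosophy (the directory and file names here are invented for the demo): suppose a folder holds the layer files of a few slums, and we want to count how many files of each layer type exist. Each tool in the chain does exactly one job, and the pipe glues them together:

```shell
# Set up a tiny demo directory (names are hypothetical).
mkdir -p demo/slumA demo/slumB
touch demo/slumA/hhinf.shp demo/slumA/rplot.shp demo/slumB/hhinf.shp

# One small single-purpose program per step:
find demo -name '*.shp' |   # list every layer file
  xargs -n1 basename |      # drop the directory part
  sed 's/\.shp$//' |        # strip the extension
  sort |                    # group identical names together
  uniq -c                   # count each layer type
```

Each intermediate step can be inspected on its own, which is exactly how one learns to formulate the question before formulating the command.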

It is not just about approaching and successfully completing a task, but about developing a certain way of thinking and approaching a problem - or as the geo-political expert Andrei Martyanov put it in his brilliant book - "to develop complex synaptic connections which are applicable for everyday life."

Some more hands-on stuff...(OR) how Linux helped Jaga Mission in Odisha

In the previous post I had started discussing the Linux command line and the incredible flexibility and power it provides to the user. 

Using the command line, and progressing (which happens quite naturally) towards scripting and programming, also halts and reverses the "Smart when you create - Dumb when you consume" process.

The simple fact is that we can't depend on an external IT specialist or a ready-made software for most of the problems that we face regularly in our work.

Only we know the specific problems that we face in our particular work environments - and they may pop up anytime. It is impossible to out-source all such problem situations to an external software consultant.

Similarly, there may be many tasks at work which could be solved and/or automated through the command line or scripts (a series of commands written down in a file for execution). I have already shown some examples in the previous post.

In this post let me show another example of a slightly higher order of complexity than the ones I showed earlier.

The implementation of Jaga Mission, the flagship slum improvement project of the Government of Odisha, where I worked as a consultant urban planner, involved the creation of a pretty huge geo-spatial database.

In the first phase of the project, about 2000 slums located in 109 cities and towns of the state were mapped using quadcopter drones. The very high resolution (2.5 cm) imagery was geo-referenced and digitized to create the necessary geo-spatial data layers. 

The following were the major data layers that were prepared for each slum settlement -

a) the high-resolution drone image

b) layer showing the individual slum houses

c) layer showing the slum boundary

d) layer showing cadastral (land ownership/tenancy) data corresponding to the extent of the slum settlement.

e) layer showing the existing land-use of the slum settlement

This led to the creation of a pretty substantial geo-spatial database of about 10,000 map layers. In an earlier blog on operational parameters, I have explained how this database was crucial to fulfilling the goal of Jaga Mission of granting in-situ land rights to slum dwellers. 

The geo-spatial data was particularly useful in complex situations, such as slums located on certain specific categories of land, where granting in-situ land rights may not be possible. 

When this data was handed over to the Jaga Mission office by the technology consultants, the data-sets were organised in a manner which made quick retrieval and analysis difficult.

The individual layers were stored in a series of folders and sub-folders in a manner as shown in the diagram below -

 



In order to retrieve any layer of a particular type (say, the slum household map) of any slum, one would have to first open the folder of the respective district; then the folder of the respective ULB (Urban Local Body i.e. the city) ; then the folder of the respective slum and then the necessary layer(s).

The file names of the individual layers just mentioned the type (e.g. "hhinf" for the household layer; "rplot" for the revenue plot/cadastral layer etc), without giving any further information suggesting the name of the slum or city.

While this is absolutely fine for manually retrieving the separate layer files and operating on them in a Geographic Information System (GIS) software, this method of data management is incompatible with any attempt at programming, automation or quick retrieval.

And when we are dealing with 30 districts; 109 ULBs; 2000 slums; and 10,000 data layers, then quick and precise retrieval is essential. Any kind of programming or process automation could also be extremely useful. 

For example, it was decided by the Government that slums located on land belonging to the Railways may need to be re-located to alternate sites. The process could be done by filtering the cadastral layers based on land ownership by the Railways and then selecting the houses which intersect with those parcels from the slum-households layer. 

However, given the manner in which the files were named and organised, this process would have to be done manually on a slum by slum basis. In the absence of an army of GIS technicians (something that the Jaga Mission did not possess), the process was bound to revert to an even more laborious process of municipal staff and revenue officials physically visiting the slums and checking if they were located on railway land.

It was almost as if the elaborate digital database had never been created.

Titusz and I wished to rename and re-organise the data files in a manner that would enable near-instantaneous retrieval and processing. But, of course, even to rename the files (in order to enable scripting), we would need - you guessed it - scripting !...or how else does one rename 10,000 files stored in separate folders and sub-folders ??

So, we wrote a script which would loop over each of the 30 district folders and recursively go down each sub-folder until it reached the bottom-most level where the data files were stored. 

Every time the script would move down a folder level, it would store the name of the folder as a variable. Once it reached the level of the individual file it would rename it by adding the relevant stored variables as prefixes to the original name of the file. The resultant file name would therefore contain the name of the city, the name of the slum and the type of the data layer (there was no need to add the name of the district to the file name).

The following diagram shows the concept behind the script -

 

Once this process was completed, there was no need to store the files in separate folders and sub-folders. They could be kept in a single folder and files of any combination of city name, slum name and type could be retrieved instantly.
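The renaming logic described above can be sketched in a few lines of shell. To be clear, this is only an illustration of the idea, not the actual Jaga Mission script - the district, city, slum and layer names below are invented for the demo (the layer codes like "hhinf" and "rplot" are the ones mentioned earlier):

```shell
# Demo hierarchy: district/ULB/slum/layer (all names hypothetical).
mkdir -p data/Khordha/Bhubaneswar/SlumX
touch data/Khordha/Bhubaneswar/SlumX/hhinf.shp
touch data/Khordha/Bhubaneswar/SlumX/rplot.shp

cd data
for district in */; do              # level 1: district folders
  for ulb in "$district"*/; do      # level 2: ULB (city) folders
    for slum in "$ulb"*/; do        # level 3: slum folders
      ulb_name=$(basename "$ulb")   # store the folder names as
      slum_name=$(basename "$slum") # variables on the way down
      for f in "$slum"*; do         # level 4: the layer files
        # prefix the file with the stored ULB and slum names
        mv "$f" "$slum${ulb_name}-${slum_name}-$(basename "$f")"
      done
    done
  done
done
cd ..
```

Once every file name carries its city, slum and layer type, the files can all sit in one flat folder, and a single glob such as `ls *Bhubaneswar*hhinf*` retrieves any combination instantly.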

Not only did we have fun creating a script that solved our problem by making use of the names of the very folders which constituted the problem, but we also ended up with a fresh system which drastically reduced the time taken for analysis and decision-making in all subsequent tasks.

Effectively, we used the problem to solve the problem.

As a direct consequence, it reduced the burden of manual labour which would otherwise have fallen on the shoulders of municipal workers, and it also reduced the hardship faced by slum dwellers due to incorrect decision-making in a process as challenging as relocation.


More on those stories in the forthcoming blogs...

Monday, January 16, 2023

With great power comes great...Idiocy !


One of the tragedies of computers in the present times is that they are almost always used for things which they are not supposed to be used for. 

As the name suggests, the primary task of the computer is to compute - to take over the boring, repetitive and labour-consuming computing tasks of human beings so that the species can focus on more creative and meaningful (i.e. human) pursuits.

However, observing the use of the computer by development-sector managers and professionals, one would conclude beyond doubt that the electronic computer was created for the sake of making unnecessarily heavy, pointless, graphics-loaded PowerPoint presentations.

Indeed, if one counts the number of hours (sometimes, all the hours) that young professionals spend adjusting images, animations and text in their PowerPoint presentations, one wonders whether computers have increased office-based manual labour by orders of magnitude instead of reducing it.

Of course, a major cause of this, which I have already written about in a previous blog, is the continued dependence on proprietary software despite the steady growth of open-source software and operating systems.

While proprietary software and operating systems treat computer users primarily as consumers of technology (distracting them with ever-flashier, "user-friendly" products, designed to make the consumer feel tech-savvy while simultaneously building a technological dependence akin to substance addiction), open-source software and operating systems encourage users to gradually transform into a free community of developers...liberating them from addiction to any specific product created by a company (for its own profits, of course) and enabling them to create their own products, suited to their own needs.


The super-computers in our pockets

In his brilliant talk titled "You Should Learn to Program", computer scientist Christian Genco said something which was quite eye-opening and embarrassing at the same time.


He showed a photo of the Apollo Guidance computer that was created by NASA engineers to do the complex calculations necessary for the Apollo-11 moon-landing mission. He then flipped out a smart phone from his pocket and said, 

"Your cell phone, in your pocket right now, has the computing power to do the calculations of a million Apollo-11 missions simultaneously ! NASA scientists in 1961 would have fallen to their knees and worshipped you for having this kind of technology !"


The Apollo Guidance Computer 


Christian then wrapped up the irony by showing a clip of a little video game and said,

"And what are you using it to do ??"


Here we are, faced with a zillion challenges hammering our clueless city governments from all sides....the challenge of affordable land; of urban poverty; of the vulnerability of the informal sector; of climate; of traffic; of housing...(the list goes on and on)...and we use one of the most powerful tools ever created by human beings to play video games, spend hours on social media and make colourful PowerPoint presentations.

Enter the Command Line...may the Force be with You

Many years ago, when I was in high school, it was screens like the one shown below that made me run away from computers.


 

Little did I know that many years later this grim, dark screen - known as the terminal window - would turn into my biggest ally. I shall tell the story of my introduction to Linux in another blog. But for now, let us talk a bit about this dark screen. The terminal window is where you type commands to your computer...and when the computer responds, i.e. executes your command, you begin to sense the true power both of the machine and of yourself.

Of course, in the beginning the advantage is not very evident, and I wondered why I should not simply go click-click with the mouse on the graphical user interface (GUI). After all, that is the most familiar and convenient way of interacting with computers running Windows.

But very soon the things you can do with the command line start overtaking all that you can do with your mouse...and then, suddenly, they go totally beyond the reach of the mouse, as if a spaceship from "Star Trek" had jumped to warp speed and disappeared from sight.

Let's say, I want to create a folder called "data" using the command line. I would type the following at the command prompt and press enter -

mkdir data

On the terminal it would look like this -

$ mkdir data

But I can easily do the same by right-clicking in the Windows file manager and selecting the option to create a new folder, right ?

But let us now try to make five folders named data-1, data-2, data-3 and so on. Now it begins to get slightly inconvenient to use the mouse.

But on the command line, you would just have to type the following line and press enter -

mkdir data-{1..5}

And it's done.

But now....let's make 500 folders !

On the command line, all that you have to type is this tiny command -

mkdir data-{1..500}
That's it ! Open your file manager, and you will find all 500 folders waiting for you.



You have activated warp drive and left the mouse somewhere far behind in the universe. 

Now, imagine that over a long time these folders have been filling up with all kinds of files (Word files, Excel files, image files, PDF documents etc.) concerning your work.

Suppose you now want to create a new folder called data-pdf and copy just the PDF files into it. All 500 folders may contain PDF files, or only some of them...you cannot be sure. What would be the mouse-based way of doing this ? Open each folder, select the PDF files, and copy them into the new folder.

Or you could type the following two commands -

mkdir data-pdf
cp -r */*.pdf data-pdf


The first command -- mkdir data-pdf -- makes a new folder with the name data-pdf.

The second command -- cp -r */*.pdf data-pdf -- uses the cp (copy) command with the pattern */*.pdf, which looks inside each of the folders for files ending with the .pdf extension, and copies just those files into the newly created folder called data-pdf.
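One caveat worth noting: the pattern */*.pdf only reaches exactly one folder down, which happens to be our situation here. If the PDFs could be buried at any depth, the find command does the truly recursive version - a small sketch, with made-up file names:

```shell
# A PDF hiding three levels deep, plus a decoy text file.
mkdir -p deep/a/b/c data-pdf
touch deep/a/b/c/report.pdf deep/a/note.txt

# find descends to any depth; -exec copies each match into data-pdf.
find deep -name '*.pdf' -exec cp {} data-pdf/ \;
# data-pdf now contains report.pdf but not note.txt
```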

 

Well, well ! Now that is something, isn't it ? We asked the computer to perform quite a complex search-and-retrieval task, and it did it...in far less time than the blink of an eye.

And we did this with just two lines of ultra-simple commands.

Imagine what all we could do with a series of such commands.


A series of commands tied together....well, that's a program !!

If you could do so much with these mighty single-line commands (and they are mighty, as we shall see)....imagine what you can do with a bunch of them tied together !!
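To give just a taste - here is a toy program of a handful of lines (the file names are invented) which files every document into a folder named after its extension:

```shell
#!/usr/bin/env bash
# Make a messy folder with a few files of different types.
mkdir -p messy
touch messy/a.pdf messy/b.pdf messy/c.txt

# For each file: read off its extension, create a folder for that
# extension if needed, and copy the file into it.
for f in messy/*.*; do
    ext="${f##*.}"           # everything after the last dot
    mkdir -p "sorted/$ext"
    cp "$f" "sorted/$ext/"
done
# sorted/pdf/ holds a.pdf and b.pdf; sorted/txt/ holds c.txt
```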

That's what we will explore here.


(to be continued...)