
Showing posts with label Information Technology.

Monday, June 26, 2023

Top 10 technologies to learn for significant professional growth in 2023

 Technology is constantly evolving and has become an integral part of our daily lives. As we move into 2023, it is no secret that technology is changing the way we live, work and interact with one another. With technological advancements occurring at a rapid pace, it can be difficult to keep up with the latest trends. However, staying informed and up to date with them can be a huge advantage, especially when it comes to career growth.

Here are 10 booming technologies that are expected to drive significant professional growth in 2023.

Artificial Intelligence (AI) and Machine Learning (ML)

These are two of the fastest-growing technologies today. AI involves the creation of intelligent machines that can work and think like humans, while ML involves the use of algorithms that help machines learn from data and improve their performance over time. With these technologies, machines have become capable of performing tasks that previously required human intelligence and decision-making.

The use of AI and ML is becoming increasingly popular in various industries, including finance, healthcare, retail, and manufacturing. Companies are now looking for professionals who can help them integrate AI and ML into their processes to increase efficiency and productivity.
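The idea of "learning from data" can be made concrete with a minimal Python sketch: fitting a straight line to points by ordinary least squares, the closed-form core of linear regression. The data below is invented purely for illustration.

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b (one feature).

    The 'learning' step is just computing the slope from the
    covariance of x and y and the variance of x.
    """
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    a = cov / var          # learned slope
    b = my - a * mx        # learned intercept
    return a, b

# Toy "training data" generated from y = 2x + 1
xs = [0, 1, 2, 3, 4]
ys = [1, 3, 5, 7, 9]
a, b = fit_line(xs, ys)    # recovers a = 2, b = 1
```

Real ML systems replace this closed form with iterative optimisation over far richer models, but the principle — adjust parameters to fit observed data — is the same.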

Internet of Things (IoT)

IoT refers to a network of interconnected devices that can communicate with each other and exchange data in real-time. This technology has revolutionized the way we live and work, and it is expected to have an even greater impact in the coming years, especially in industries like healthcare, manufacturing, and transportation.

As IoT continues to grow, there will be an increasing need for professionals who can develop and maintain these systems. Jobs such as IoT developers, network architects, and data analysts will be in high demand in the coming years.

Blockchain

This technology has been gaining popularity in recent years due to its ability to provide secure and decentralized record-keeping. Blockchain is being used in a variety of industries, including finance, healthcare, and logistics.

As blockchain continues to grow, there is expected to be an increasing need for professionals who can develop and implement this technology. Jobs such as blockchain developers, security analysts, and project managers could be in demand in the years to come.
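The "secure record-keeping" property can be sketched in a few lines of Python: each block stores the hash of its predecessor, so altering any earlier record invalidates every later hash. This is a toy illustration only — a real blockchain adds consensus, signatures, and a distributed network.

```python
import hashlib

def block_hash(index, prev_hash, data):
    """SHA-256 over the block's fields."""
    return hashlib.sha256(f"{index}|{prev_hash}|{data}".encode()).hexdigest()

def build_chain(records):
    """Link each record to the previous block's hash."""
    chain, prev = [], "0" * 64  # genesis predecessor
    for i, data in enumerate(records):
        h = block_hash(i, prev, data)
        chain.append({"index": i, "prev": prev, "data": data, "hash": h})
        prev = h
    return chain

def is_valid(chain):
    """Recompute every hash; any tampering breaks the links."""
    prev = "0" * 64
    for blk in chain:
        if blk["prev"] != prev or blk["hash"] != block_hash(blk["index"], prev, blk["data"]):
            return False
        prev = blk["hash"]
    return True

chain = build_chain(["pay A 10", "pay B 5", "pay C 2"])
```

Changing any block's data (say, `chain[0]["data"] = "pay A 99"`) makes `is_valid(chain)` return False, which is the tamper-evidence the article refers to.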

Cybersecurity

Cybersecurity is becoming increasingly important as we rely on technology to conduct ever more of our daily lives. With the rise of cyber threats, companies are looking for professionals who can help protect their systems and data from cyber-attacks. Cybersecurity jobs such as security analyst, network security engineer, and ethical hacker will therefore be strong career choices in the coming years.

Cloud Computing

Cloud computing has been a game changer for businesses, allowing them to store and access data from anywhere in the world. This technology has revolutionized the way we work and is being leveraged by various companies across healthcare, finance, and e-commerce.

Jobs such as cloud architects, cloud engineers, and cloud security specialists are therefore expected to see rising demand in the coming years.

Augmented Reality (AR) and Virtual Reality (VR)

Although still in their infancy, both of these technologies have the potential to change our perception of the world around us. AR involves the overlay of digital information onto the real world, while VR involves the creation of a completely immersive digital environment.

These technologies are being used, with considerable success, in a variety of industries, including gaming, education, and healthcare. From a career perspective, therefore, roles like AR/VR developer, UX designer, and 3D artist will prove to be a good investment.

Data Science and Analytics

Data Science and Analytics are two fields that are becoming increasingly important as companies strive to make data-driven decisions. Data Science involves the extraction of insights from data, while Analytics involves the use of statistical and mathematical methods to analyze data and make predictions.

As companies continue to collect vast amounts of data, there will be an increasing need for professionals who can extract insights from it. From an employee's point of view, investing time and effort in roles such as data scientist, data analyst, and data engineer will prove beneficial in the coming years.

Robotics and Automation

Robotics and Automation have had a large-scale impact on the manufacturing and logistics sectors. These technologies not only allow machines to perform tasks that would otherwise require human intervention but also allow for the use of software to automate repetitive tasks.

Quantum Computing

Quantum Computing is still in its early stages, but it has the potential to revolutionize how we process and store data. It involves the use of quantum-mechanical phenomena, such as superposition and entanglement, to perform computations. And given its vast growth potential, there is expected to be an increasing need for professionals who can develop and implement this technology.
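Superposition can be illustrated numerically. The toy Python sketch below (not how real quantum hardware is programmed) represents a qubit as two complex amplitudes and applies a Hadamard gate, which turns the definite state 0 into an equal mix of 0 and 1.

```python
import math

def hadamard(state):
    """Apply the Hadamard gate to a single-qubit state (a, b)."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

zero = (1 + 0j, 0 + 0j)        # the qubit definitely reads 0
superposed = hadamard(zero)     # equal superposition of 0 and 1

# Measurement probabilities are the squared magnitudes of the amplitudes
probs = [abs(amp) ** 2 for amp in superposed]   # [0.5, 0.5]
```

A classical bit is always one of the two values; here the single state carries both outcomes at once until measured — the property quantum algorithms exploit.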

5G Technology

5G Technology is the latest generation of cellular networks, offering faster speeds, lower latency, and increased capacity. By enabling the development of new applications and services, 5G is expected to revolutionize the way we interact with the world around us.

Staying up to date with the latest trends and technologies can be a huge advantage, especially when it comes to career growth. Whether you are a seasoned professional or just starting out, investing in developing skills in some of these areas can help you stay ahead of the curve and secure a successful career in the tech industry.

Gaurv Bhatia

Source: The Telegraph, 27/05/23

Thursday, August 11, 2022

Change is constant

 The mobile phone wins hands down — from communication to storage, from entertainment to learning, it is all on your phone


Some of us were more disturbed than impressed when we read a poem in The New Yorker written by an Artificial Intelligence bot. The last two lines of the poem on cryptocurrency are chilling: “Of inventing money, just like that,/ I ask you, is nothing sacred?” The AI, code-davinci-002, had been ordered to write the poem in the style of Philip Larkin and it was written in less than a second. “We are being replaced by a button,” a friend remarked ominously. Is the time imminent then that our students will produce an essay or a poem with a quick ‘command’ and we teachers will be reduced to teaching them how to give the right prompts? We don’t really know, but we can certainly try to understand the implications of the signs of the times. 

The toddler rushes to the door every time the bell rings, hoping it is ‘Amazon Uncle’. He has never been taken shopping because of the pandemic, not even to a glittering mall. No worries. All his toys and clothes are delivered to the door.

Attendance in some schools is through biometric devices. I doubt that we will see the old school register with students’ names laboriously handwritten in alphabetical order very much longer and the familiar response, ‘Present, Sir’, will not be heard. I also see the disappearance of the greetings ‘Good morning’ and ‘Good evening’. It is being gradually replaced with ‘Hi’ or ‘Hello Ma’am’ — the teachers’ responses to this are mixed. Talking about forms of address, first names are used these days without so much as a ‘by your leave’ but caution is exercised where gender is involved. No longer is one limited to ‘he’ or ‘she’ as some prefer the inclusive word ‘they’. The Bengali term, ‘aapni’, is hardly used by young people — the informal, ‘tumi’ or ‘tui’ are far more prevalent. I think that this indicates a preference for an informal conversational style rather than a lack of respect.  

How do the young of today relate to books? Many kindergarteners would much rather look at their tablets than go through their static but colourful books. Even senior students admit that they find it difficult to ‘process’ their text or reference books or type-written documents. According to them, moving images, sounds, animation, and movie clips, along with bulleted points and handy notes (as seen in many YouTube lessons) make studying much easier. School libraries have begun to house digital material in the form of audio books, films, podcasts, and video lessons. Recently, I happened to view the digital collections in a public library exhibition entitled Treasures and indeed they seemed as valuable as the old manuscripts on display.

Tasks are executed differently too, with Siri or Alexa serving as a useful helpmate. (Worryingly, they even serve as companions to some lonesome youngsters.) For various assignments, screenshots of the design or plan are prepared in advance and then the finished product is presented. Practice sessions of programmes are video-filmed and played back for comments and advice. Students in drama and elocution classes not only learn to throw their voices but are also taught how to modulate them while using microphones. Sport is becoming increasingly fine tuned and those inclined towards games and athletics select their respective areas of specialisation early in life. Sadly, we hardly find children playing a sport for sheer pleasure.

The late Sir Ken Robinson, one of the greats in education, once stated that the young don’t wear watches any longer as they are ‘single-function’ devices. This is not true now: we find more and more young people sporting Fitbit watches to keep track of their fitness regimen — incidentally, these watches tell the time too. But as the most useful device, it is the mobile phone that wins hands down — from communication to storage, from entertainment to learning, it is all on your phone. These are some of the signs of our times. Whether we like them or not, changes will keep coming fast and furious. I have had to adapt to these changes so rapidly in recent years that there was no time to ponder on the good old days. But if I live a little longer, I hope to dwell deliciously on a slower and more intelligible time.

Devi Kar

Source: The Telegraph, 11/08/22

Monday, August 08, 2022

Learning machines

 The economic downturn caused by Covid-19 was the making of one class of business: the edutech industry. The closedown of schools created a need to teach students remotely. The electronic mode was the only possible means. But the way it was adopted prompts deep misgivings.

I am actively involved with computer applications in teaching and research. The promise held out by digital learning excites me. Its progress in India fills me with alarm.

The dismal backdrop to my discussion is the digital divide. We are content that for the poor, a single smartphone should be considered a sufficient educational tool for all students in a household. Even that, a parliamentary committee found last year, eluded 77 per cent of the nation’s children.

But today, let us think about the fortunate ones with laptops and smartphones for their sole use. When the pandemic broke, their schools soon switched to online classes. But online teaching implies more than a Zoom meeting. It calls for audio-visual techniques for which most schools had neither expertise nor infrastructure. Plain vanilla classroom teaching falters without a classroom. 

That is where edutech companies saw their chance. They applied digital technology expertly and intensively to the curricular content. Their instructors exuded a compelling onscreen presence, as conventional teachers had never learnt to do. The result was a package that captivated both children and parents footing the bill. Both parties were connoisseurs of onscreen content: the children from computer games, the parents from infotainment channels. The superstition is rife anyway that anything emerging from a computer is a superior option. In two short years, hitherto uncontested schooling methods acquired the negative label of ‘offline teaching’.

But might not the new technology truly be superior? The digital revolution has transformed our lives. In intellectual and cultural matters, however, it has generally modified older practices instead of dislodging them altogether. More books are printed today than ever before, alongside the electronic text and the internet. Live performances flourish despite staggering advances in audio-visual recording. The equation between ageless human practice and digital innovation is subtle and complex. With education, the pandemic drastically short-circuited this adjustment.

Throughout history, teaching has implied an interaction between teacher and student. A child learns letters and numbers under a teacher’s care among a group of peers. Every primary-school teacher I have asked agrees that small children cannot be taught online to read, write and count. If some learn to do so, it is because an adult is present to guide the process.

With older children, the challenge is subtler. Edutech planners will tell you that they allow for individual attention and interaction. Learners can follow their own pace, assess themselves by self-testing, and even ask questions. The interaction is largely through precoded exercises and bots, but the best (and costliest) courses find slots for human mentors. Yet all these features are worked into a pre-set, one-way system: an extended IT program, ‘remote’ in every sense.

To be sure, there are physical schools so ill-run that online instruction is a better alternative. But even a halfway decent institution offers the imperative human exchange. A lecturer in a classroom subconsciously attunes herself to the faces in front of her. Students’ queries cover a range that artificial intelligence cannot tackle — above all because it ignores individual psychology, the personal factors impacting a student’s development. A packaged online program can never overstep its boundaries, never warm to a bold question or an out-of-the-box suggestion. At most, it fosters a competent mediocrity. Hence the best students benefit the least from online courses, which stunt their potential.

Edutech is the white flour and refined sugar of learning. To consume it is better than to starve, but it is no substitute for a wholesome home diet, even if indifferently cooked. (That is no excuse not to improve the cooking.) To vary the image, the stuff of digital learning is both literally and metaphorically behind a screen: you see it, but you can’t reach through and grasp it.

Such charges are customarily made against private coaching. Coaching centres are reviled on principle but rife in practice. Edutech providers profess the same adjunct role. But given their reach, glamour and opulence, they play a much more visible and increasingly central role in India’s education system.

This is because they blend with the current ecology of public services, cutting down State forests and planting corporate groves. Online teaching is vastly cheaper to provide: it does not need a standing army of teachers. The high demand is fanned by both commercial and official publicity. The Union government has perfected a new rhetoric extolling online teaching, never mind the digital divide. PM eVidya, the grandest of many schemes, aims to provide online education to every student in India. This may or may not be the same as the ‘digital university’ promised in this year’s budget, while actual universities languish for want of funds.

Education is following the path of our healthcare services, with an endlessly expanding role for the private sector. The economics drives the technology. State agencies have their own e-learning platforms: Diksha, ePathshala and Swayam, among others. Yet our rulers are warming more and more to private operators. Universities can outsource 40 per cent of course content for online degrees (themselves a recent innovation) and engage edutech companies to ‘assist’ even with the rest. There is even talk of such companies carrying out evaluation.

In today’s India, practices once thought harmful or illicit are routinely legitimised and then made standard. Not so long ago, we deplored the possibility of commercial coaching empires influencing exam results and curricula. This might soon become normative and organic to the system.

No academically respectable country has surrendered its education sector to profit-seeking interests in this way. When all is said, Indian education has an honourable place in the world’s eyes. We denigrate our public education system, but its alumni win success and acclaim everywhere. Let us not sell out on that legacy. 

Sukanta Chaudhuri is Professor Emeritus, Jadavpur University

Source: The Telegraph, 8/08/22

Tuesday, July 19, 2022

How QR codes work, and how they are hacked

 The ubiquitous QR code was invented in 1994 by Japan’s Denso Wave; company engineer Masahiro Hara created it originally with the intention to make manufacturing operations more efficient

In this era of digitalisation, there is never a day that passes without the use of a QR code. This technology has become a part of our lives, more so after the COVID-19 pandemic in 2020 with an emphasis on going contactless to avoid the spread of the deadly virus.

The ubiquitous QR code (quick response code) was first invented in 1994 by Japan’s manufacturer Denso Wave. August 8, 2021, marked the 27th anniversary of the QR code.

The QR code was developed by Denso company engineer Masahiro Hara, originally with the intention to make manufacturing operations more efficient.

According to Denso, it decided to make the technology license-free in order to encourage its use by as many people as possible and released QR codes for general use.

What is a QR code?

It is a type of barcode with a series of black pixels in a square-shaped grid on a white background. It contains various forms of data, like website links, account information, phone numbers, or even coupons.

Unlike standard barcodes, which are read in only one direction (top to bottom) and store only a small amount of information, QR codes are two-dimensional (2D): they can be read in two directions, top to bottom and right to left. This allows them to store more data — 7,089 digits or 4,296 characters — in approximately 10 times less space than a traditional barcode.

A QR code can encode numerals, alphabetical characters, symbols, binary data, control codes and other data, and it can be read at high speed regardless of the scanning angle. The secret lies in three position detection patterns, located in each code, which enable stable high-speed reading unaffected by background patterns.

Position detection pattern

The most challenging problem for the QR code's development team was how to make 2D codes read as fast as possible, since it is more difficult for scanners to recognise the location of a 2D code than that of a barcode. One day, Hara hit on the idea of adding information to the code that indicates its location, which might solve this problem.

Based on this idea, a position detection pattern, located at three corners of each code, was created. He expected that by incorporating this pattern into a 2D code, a scanner could accurately recognise the code and thereby read it at high speed.

However, developing the shape of the position detection pattern was extremely difficult because when a similarly shaped figure was near the code, the pattern could not be recognised accurately. To prevent false recognition, the position detection pattern had to have a unique shape.

“The development team members began an exhaustive survey of the ratio of white to black areas in pictures and characters printed on leaflets, magazines, corrugated cartons and other documents after reducing them to patterns with black and white areas. They continued to study numerous printed matter day and night, and at last, identified the ratio that least appeared on the printed matter. It was 1:1:3:1:1. In this way, the widths of the black and white areas in the position detection pattern were determined and scanners became able to detect the code regardless of the scanning angle by finding this unique ratio,” the company explained.
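The 1:1:3:1:1 detection trick can be sketched in Python: run-length encode a scan line of pixels, then look for five consecutive runs whose widths match that ratio. This is a simplified sketch — real decoders scan many lines and handle noise and perspective distortion.

```python
from itertools import groupby

def run_lengths(row):
    """Collapse a row of pixels (1 = black, 0 = white) into (color, length) runs."""
    return [(color, len(list(group))) for color, group in groupby(row)]

def has_finder_ratio(row, tol=0.5):
    """Look for five consecutive runs in the ratio 1:1:3:1:1, starting on black."""
    runs = run_lengths(row)
    expected = [1, 1, 3, 1, 1]
    for i in range(len(runs) - 4):
        window = runs[i:i + 5]
        if window[0][0] != 1:          # the pattern starts with a black run
            continue
        lengths = [n for _, n in window]
        unit = sum(lengths) / 7.0      # total width spans 7 "module" units
        if all(abs(n - e * unit) <= tol * unit for n, e in zip(lengths, expected)):
            return True
    return False

# A scan line crossing a finder pattern drawn with 2-pixel modules:
# black(1) white(1) black(3) white(1) black(1), padded with white
row = [0]*4 + [1]*2 + [0]*2 + [1]*6 + [0]*2 + [1]*2 + [0]*4
```

Because the check is on the *ratio* of run widths rather than absolute pixel counts, it works at any scale and from any scanning angle along the line, which is exactly why Denso chose it.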

How do QR codes work?

According to anti-virus provider Kaspersky, the patterns within QR codes represent binary codes that can be interpreted to reveal the code’s data. A QR reader can identify a standard QR code based on the three large squares outside the QR code. Once it has identified these three shapes, it knows that everything contained inside the square is a QR code. The QR reader then analyses the QR code by breaking the whole thing down into a grid. It looks at the individual grid squares and assigns each one a value based on whether it is black or white. It then groups grid squares to create larger patterns.

Parts of a QR code

A standard QR code is identifiable based on six components: Quiet Zone, Finder pattern, Alignment pattern, Timing pattern, Version information, and Data cells, said Kaspersky and explained the following.

  • Quiet Zone: This is the empty white border around the outside of a QR code. Without this border, a QR reader will not be able to determine what is and is not contained within the QR code (due to interference from outside elements).
  • Finder pattern: QR codes usually contain three black squares in the bottom left, top left, and top right corners. These squares tell a QR reader that it is looking at a QR code and where the outside boundaries of the code lie.
  • Alignment pattern: This is another smaller square contained somewhere near the bottom right corner. It ensures that the QR code can be read, even if it is skewed or at an angle.
  • Timing pattern: This is an L-shaped line that runs between the three squares in the finder pattern. The timing pattern helps the reader identify individual squares within the whole code and makes it possible for a damaged QR code to be read.
  • Version information: This is a small field of information contained near the top-right finder pattern cell. This identifies which version of the QR code is being read.
  • Data cells: The rest of the QR code communicates the actual information, i.e., the URL, phone number, or message it contains.

Types of QR code

QR codes can be used for multiple purposes, but there are four widely accepted versions of QR codes. The version used determines how data can be stored and is called the “input mode”. It can be either numeric, alphanumeric, binary, or kanji. The type of mode is communicated via the version information field in the QR code.

  • Numeric mode: This is for decimal digits 0 through 9. Numeric mode is the most efficient storage mode, with up to 7,089 characters available.
  • Alphanumeric mode: This is for decimal digits 0 through 9, plus uppercase letters A through Z, and the symbols $, %, *, +, -, ., /, and :, as well as a space. It allows up to 4,296 characters to be stored.
  • Byte mode: This is for characters from the ISO-8859-1 character set. It allows 2,953 characters to be stored.
  • Kanji mode: This is for double-byte characters from the Shift JIS character set, used to encode characters in Japanese. This is the original mode, first developed by Denso Wave, according to Kaspersky.
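Alphanumeric mode's compactness comes from packing two characters into 11 bits — the value is 45 × (first character's value) + (second character's value), with a lone trailing character taking 6 bits, per the QR specification (ISO/IEC 18004). A minimal Python sketch:

```python
# The 45-character alphanumeric set, in specification order (values 0-44)
ALPHANUMERIC = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ $%*+-./:"

def encode_alphanumeric(text):
    """Return the alphanumeric-mode data bits for `text` as a bit string.

    Pairs of characters become 11-bit values (45*first + second);
    a lone trailing character becomes a 6-bit value.
    """
    vals = [ALPHANUMERIC.index(c) for c in text]
    bits = ""
    for i in range(0, len(vals) - 1, 2):
        bits += format(45 * vals[i] + vals[i + 1], "011b")
    if len(vals) % 2:                       # odd length: one character left over
        bits += format(vals[-1], "06b")
    return bits

bits = encode_alphanumeric("HELLO")   # 11 + 11 + 6 = 28 bits
```

A full QR encoder would prepend the mode indicator and character count and append error-correction codewords; this sketch covers only the data-packing step that makes the mode efficient.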

Are QR codes safe?

Kaspersky warns that attackers can embed malicious URLs containing custom malware into a QR code which could then exfiltrate data from a mobile device when scanned. It is also possible to embed a malicious URL into a QR code that directs to a phishing site, where unsuspecting users could disclose personal or financial information. Because humans cannot read QR codes, it is easy for attackers to alter a QR code to point to an alternative resource without being detected.

Can QR codes be hacked?

“The QR codes themselves can’t be hacked – the security risks associated with QR codes derive from the destination of QR codes rather than the codes themselves. Hackers can create malicious QR codes which send users to fake websites that capture their personal data such as login credentials or even track their geolocation on their phones. This is why mobile users should only scan codes that come from a trusted sender,” says the company.
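That advice can also be enforced in software before a scanned link is opened. Below is a hypothetical allowlist check in Python — the host names are invented for illustration — that accepts only HTTPS links to known hosts:

```python
from urllib.parse import urlparse

# Hypothetical allowlist of hosts the app is willing to open
TRUSTED_HOSTS = {"example-bank.com", "www.example-bank.com"}

def is_safe_qr_url(url):
    """Return True only for https:// URLs whose host is on the allowlist."""
    parts = urlparse(url)
    return parts.scheme == "https" and parts.hostname in TRUSTED_HOSTS
```

So `is_safe_qr_url("https://example-bank.com/pay")` passes, while a plain-HTTP link or an unknown host is rejected — a simple defence against QR codes that have been swapped or pasted over.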

Source: The Federal, 19/07/22

Monday, June 06, 2022

How barcodes differ from radio-frequency identification tags

 Baggage tags equipped with radio-frequency identification (RFID) will soon be available at Delhi’s Indira Gandhi International Airport, marking a first of its kind for the country.

What is RFID technology? What is the difference between RFID and a barcode? Is RFID an enhanced version of the barcode? The Indian Express explains:

What is Radio-frequency identification (RFID) technology?

Radio-frequency identification (RFID) is a wireless tracking technology that uses radio waves, together with tags and readers, to automatically identify tagged objects. Transponder, receiver, and transmitter are the three components of an RFID system.

In an RFID system, the reader continually sends out radio waves of a specific frequency. If the object to which the RFID tag is attached is within range of those waves, the tag provides feedback to the reader, which then identifies the object from that feedback.

What are the different kinds of RFID?

Passive tags, semi-passive tags, and active tags are the three types of RFID tags that are commercially available.

Passive tags have no power supply of their own: they draw their power from the reader’s incoming radio waves.

Semi-passive tags comprise an internal circuit with a power source, but rely on the radio waves received from the reader to transmit the response.

Active tags have an internal circuit powered by their own battery, which is also used to send the response to the reader.

The Low Frequency, High Frequency, and Ultra-High Frequency bands are used by RFID systems.

What is a barcode?

A barcode is a printed series of parallel bars or lines of varying width used for entering data into a computer system. The bars are black on a white background and vary in width and quantity depending on the application. The bars represent the binary digits zero and one, combinations of which represent the digits zero to nine processed by a digital computer. These barcodes are scanned using special optical scanners known as barcode readers, which come in a variety of shapes and sizes. The majority of these codes use only two different widths of bars, though some use four. The numbers that make up a barcode are also printed at the bottom. One of the most well-known examples of a barcode is the QR code.

Radio-frequency identification (RFID) technology vs barcodes

RFID uses radio waves to communicate data from RFID chips to readers that do not require line of sight in order to obtain the data, whereas barcodes use light to read the black-and-white pattern printed on the sticky tag. An RFID tag can communicate with a powered reader even when the tag is not powered.

When printed on paper or sticky labels, barcodes are more susceptible to wear and breakage, which can affect their readability. RFID tags, on the other hand, are often encased in plastic labels or embedded in the object itself, making them more durable than barcodes.

In contrast to barcode scanners, RFID scanners can process dozens of tags in a single second. Also, barcodes are simple and easy to copy or counterfeit, whereas RFID is more complicated and difficult to replicate or counterfeit.

Unlike barcodes, which must be in the scanner's line of sight, RFID tags need not be.

Also, RFID tags are expensive compared to barcodes.
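The digits printed beneath a retail barcode include a check digit that lets the scanner catch misreads. For the common EAN-13 format, it is computed from the first 12 digits with alternating weights of 1 and 3:

```python
def ean13_check_digit(first12):
    """Compute the EAN-13 check digit from the first 12 digits.

    Digits in odd positions (1st, 3rd, ...) are weighted 1,
    digits in even positions are weighted 3; the check digit
    brings the weighted sum up to a multiple of 10.
    """
    digits = [int(c) for c in first12]
    total = sum(d * (3 if i % 2 else 1) for i, d in enumerate(digits))
    return (10 - total % 10) % 10
```

For example, the first 12 digits `400638133393` yield check digit 1, completing the valid EAN-13 code 4006381333931. A scanner recomputes this sum and rejects any read whose final digit does not match.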

Is RFID enhanced version of barcode?

Since their introduction in the 1970s, barcodes have become an indispensable part of commercial activity on a daily basis, whether in grocery stores or at airports.

When it comes to speed, there is a noticeable difference between barcodes and RFID. This is because barcodes must be scanned manually, making them more susceptible to human error and their accuracy harder to evaluate.

However, RFID’s accuracy may be compromised if the tags are applied to metals or liquid. Since RFID frequencies can be transmitted over greater distances than barcode frequencies, there is also concern that RFID technology raises data protection issues, resulting in personal information becoming accessible without consent.

When deciding whether to use barcodes or RFID, it is important to consider the purpose, the environment, and the potential costs of an application.

Written by Priya Kumari Shukla

Source: Indian Express, 3/06/22


Wednesday, February 16, 2022

An optimal balance of autocracy and ‘vetocracy’ online

In his article, ‘The Decay of American Political Institutions’, political scientist Francis Fukuyama coined the term “vetocracy” to explain why the American political system was broken. He used the term to describe the political reality today, where the checks and balances originally designed to keep the executive from growing too strong have ossified into a grid-locked decision-making system in which diverse individuals have the power to prevent the implementation of public policies by simply exercising their veto.

The irony is that the veto-based systems of checks and balances Fukuyama refers to were initially introduced to prevent an individual (or small group of individuals acting together) from becoming so powerful as to operate without oversight or accountability. However, in today’s polarized political environment, instead of being used as a legitimate tool of governance, vetoes are used more often than not to make political statements. This, according to Fukuyama, is why in America today, a few powerful interest groups are able to prevent the implementation of various policies that the vast majority of the populace are in favour of.

When we use this lens to evaluate systems of governance, it becomes obvious that these concepts occupy two different ends of the same spectrum. At one extreme is autocracy, a system of governance in which individuals can execute important decisions without asking for permission, even if doing so could be potentially risky and disruptive. At the other end is Fukuyama’s “vetocracy”, where any implementation of a new policy requires the sign-off of a large number of diverse actors, any one of whom could single-handedly prevent it from coming into effect.

In a recent article, Vitalik Buterin, creator of the Ethereum blockchain, used this formulation to analyse governance systems in the digital world. He pointed out that while the physical world might, at present, have too much vetocracy, the digital sphere is rife with autocracy. This, he argues, is the reason why technology platforms have been able to wreak such broad cross-sectoral disruption, none of which would have been possible under vetocratic circumstances.

However, Vitalik believes that once the status quo has been disrupted, it is important to ensure that autocratic processes are supplanted by vetocratic systems so that trust in the system can be retained. Failure to do so would result in technology platforms becoming so powerful that they will be able to operate without oversight. This, he believes, is the reason why blockchain-based systems like DAOs (decentralized autonomous organizations) that enable decentralized governance of digital projects have grown in popularity.

Over the past decade, India has witnessed its own unique brand of digital disruption. We’ve built layers of digital infrastructure for public goods, starting with identity and payments and extending, most recently, to data-driven decision making and unbundled commerce. If we have to evaluate the success of these measures, we need look no further than the Unified Payment Interface (UPI) that currently clocks in excess of 3 billion transactions a month.

The ubiquitous adoption of this foundational infrastructure is largely due to the way we leveraged the autocratic inflexibility inherent in code to convince legacy institutions to alter their systems to conform to this new infrastructure’s specifications. That said, rolling out foundational infrastructure is just the first step. As these systems become more widely used, they need to evolve, adding new features and products in response to the demands of an evolving (and maturing) market.

When we have to decide what features should be included (and, more importantly, what should not), we will not be able to use the same autocratic approach we used at launch. There are now a large number of participants who have a real stake in the ecosystem, and any such decision must be appropriately inclusive, taking into account the concerns and misgivings that each of them may have. Unilateral (autocratic) action will erode faith in the system as a whole. At the same time, if we build a purely vetocratic governance system, there is a risk that we will get mired in the sort of stagnation that currently ails the US government.

What we need to do is find an optimal balance: one that ensures the system doesn’t fail because a few actors can do bad things unchecked, on one hand, and that prevents decisions in the interests of the entire ecosystem from being held hostage by a few individuals who wield a veto, on the other. We need to offer the vetocratic assurance that vital infrastructure cannot be captured by a privileged few, but at the same time, need to assure the market that innovation will not be sacrificed at the altar of consensus.

One way to address these concerns would be to put in place vetocratic processes to protect the institutional core; that is, the central principles that engender trust. In the context of our Data Empowerment and Protection Architecture, this might relate to the principles of individual empowerment and privacy by design that are at the core of its framework. But after we have achieved this central objective, the rest of the governance processes should be relaxed enough to ensure that innovation is not compromised.

India’s digital public infrastructure is universally well regarded. It is important that the governance systems that sustain it be equally robust. And that will come down to achieving that fine balance.

Rahul Matthan

Source: Mint epaper, 15/02/22

Tuesday, February 15, 2022

How technology is transforming e-commerce experience in the new India

 The term ‘Incredible India’ seems apt when one considers the sheer diversity of languages, cultures and ethnicities across the country. For the retail industry, which contributes approximately 10% to India’s gross domestic product (GDP), the huge consumer market is both an opportunity and a challenge. The challenge is more pronounced in the relatively-untapped rural and semi-urban regions that account for about 65% of the population.

Although the digital transformation of India’s retail industry was already under way in recent years, pandemic-linked tailwinds have accelerated the acceptance of e-commerce among both sellers and consumers. As per estimates, unorganized retail accounts for 90% of the market in India. Online shoppers are said to number 70 million, with only 10 million categorized as ‘digital natives’. Without a doubt, in a nation of more than 1.3 billion people, tremendous scope exists for attracting more online shoppers via personalized products and services, particularly from regional India.

The availability of smartphones and 4G in tier-3+ areas opened up this vast market to e-commerce players. With no physical stores to maintain, online companies have lower overheads and can offer more value-for-money prices. It must be emphasized, however, that value-conscious consumers expect quality products too.

But given the plethora of languages and customs, catering to the remote cohorts comes with unique complexities. In such scenarios, online entities can deploy digital technologies to offer their customers personalized experiences, facilitating higher conversion rates and greater brand loyalty. What’s more, the beauty of online selling lies in the fact that e-commerce portals don’t need to worry about storage space and its allied costs since goods are shipped directly from producers. For sellers, there are no worries about inventory being locked up in warehouses till sales materialize.

While all this sounds kosher, it takes more to convince and convert customers in tier-3 towns and beyond. Therefore, online retailers are using a regional language interface to offer better customer experiences. Additionally, retailers are deploying big data to decode consumer behaviour for providing bespoke offerings. In such situations, a thorough understanding of customer needs helps boost satisfaction and retention rates.

The use of AI algorithms and big data analytics also helps in analysing consumer behaviour through their shopping information such as product preferences, browsing history, etc. Predictive analytics is then leveraged to improve the customer experience by customizing marketing campaigns centred on their habits and needs. For instance, the data of a major value-based online retailer reveals that customers exploring the portal through regional languages end up spending 20% additional time on the platform, which includes higher viewership of product videos.

Accordingly, customer experiences can be personalized by analysing their purchase history to pitch relevant deals and discounts. Besides, the purchases of other consumers in the same demographic cohort can be used to upsell related products.
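The cohort-based upselling described above can be sketched in a few lines of code. This is a minimal illustration, not any retailer's actual system; the customer names, products and cohort labels are entirely hypothetical, and real deployments would use far richer signals (browsing history, language preference, recency) than simple co-purchase counts:

```python
from collections import Counter

def cohort_upsell(purchases, cohorts, customer, top_n=3):
    """Suggest items popular among a customer's demographic cohort
    that the customer has not already bought (illustrative only)."""
    cohort = cohorts[customer]
    # Peers: other customers in the same cohort
    peers = [c for c, g in cohorts.items() if g == cohort and c != customer]
    # Count how often each item appears in peers' purchase histories
    counts = Counter(item for p in peers for item in purchases.get(p, []))
    owned = set(purchases.get(customer, []))
    # Recommend the most popular peer items the customer doesn't own yet
    return [item for item, _ in counts.most_common() if item not in owned][:top_n]

# Hypothetical purchase histories and demographic cohorts
purchases = {
    "asha":  ["saree", "pressure cooker"],
    "ravi":  ["pressure cooker", "mixer"],
    "meena": ["saree", "mixer", "kurta"],
}
cohorts = {"asha": "tier3-west", "ravi": "tier3-west", "meena": "tier3-west"}

print(cohort_upsell(purchases, cohorts, "asha"))  # → ['mixer', 'kurta']
```

The same pattern generalizes to the personalized deals mentioned above: substitute discount offers for products, and rank them by how often peers in the same cohort redeemed them.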

Moreover, retailers must provide a personalized omni-channel experience whereby a customer can order a product on the website/app and then collect from a brick-and-mortar outlet near them if they so desire. Amidst all this, one must realize the importance of social media as a digital shop floor where sellers can engage with prospective customers directly. Instagram and Facebook are prime examples here with the former offering visual storefronts that help increase customer traction.

Bear in mind, though, that marketers on social media must be digitally-savvy and in sync with customer needs and expectations. Any mismatch in consumer expectations runs the risk of a bad experience being put under the public gaze.

If big data and other tech tools are used judiciously to engage with consumers, they can act as a key differentiator in providing a clear edge to e-commerce players in a highly cluttered and hyper-competitive retail market. In this way, an ongoing relationship can be established with consumers throughout Bharat.

Thereafter, despite the diversity of consumer choices, more satisfaction and increased brand loyalty are bound to follow across non-metro regions. Undoubtedly, technology can offer a winning proposition for customers, sellers, online platforms and other retail stakeholders.

Sanjeev Barnwal, co-founder and chief technology officer of Meesho.

Source: Mint epaper, 15/02/22

Friday, January 21, 2022

Seven predictions for the world of technology in 2022

A quote that is variously ascribed to Yogi Berra, Niels Bohr and even Mark Twain goes something like: “Never make predictions, especially about the future." Regular readers of Tech Whispers, however, have ignored these wise words and have been clamouring for my predictions on technology in 2022. Peering myopically at my personal crystal ball, here is what I see:


Artificial Intelligence (AI) everywhere: Pretty much like digital, or electricity (as Andrew Ng said), AI will not be one more thing we do, but will be infused in most objects around us, from cars and phones to TV sets and soon everything else we use. This will usher in the Edge AI revolution, where AI is not in some central server somewhere, but embedded in objects ‘at the edge’. As AI becomes increasingly ubiquitous, questions about ethics in AI usage, responsible AI and explainability will become more strident. I expect one large incident, a Cambridge Analytica scandal of AI, to happen and bring AI ethics into the common imagination.

For better or for Metaverse: The Metaverse, non-fungible tokens (NFTs) and Web3 hype will continue this year, fuelled by crypto ‘bros’ and even other bored apes. There is substance behind the hype—the rise of the creator economy and the proposed decentralization of the web—but there is a lot of fluff too, and that will likely crash and burn. Alongside, crypto will continue to mature, with it becoming more mainstream and some of its real potential getting realized. Here’s a specific prediction: the first $100 million NFT will be sold this year (unless already done by the time this article appears).


Elon Musk rules: 2021 was the year of the entrepreneur behind Tesla and SpaceX, and so will 2022. Musk will continue to reshape energy, cars, space, transportation and other industries; he might even pick a new one to reshape this year. As he does so, he will not only reign as the world’s Tech Overlord, but also give technology a new way of thinking and a new set of rules. He will show how it can be used to remake vast physical and infrastructure businesses. Thus, he will continue to be the richest man on earth, increasing his lead over Amazon’s Jeff Bezos and Microsoft’s Bill Gates.

The pandemic ends: Here is where I am going to truly go out on a limb and say this will be the year that the covid virus establishes an equilibrium with the human race. The Omicron variant will convert the raging pandemic into an endemic, much like the flu, and we will learn to live with it with periodic vaccines. Author Laura Spinney said in her 2018 book Pale Rider that “pandemics end socially, not medically", and that is how this one will peter out too. However, this won’t be the last one, as the ravaging of our planet may prompt newer viruses to consider human hosts.


The rise of green AI/software: The cloud, AI, computers and electric cars are hugely polluting industries, despite the popular impression of their being clean and gentle. Manufacturing one PC needs 240kg of fossil fuels, training one model for natural-language processing emits the same amount of carbon dioxide as 125 New York-Beijing round trips, and the world’s data centres consume almost as much electricity as South Africa does. As this awareness grows, we will see the advent of green AI and software, with governments and corporations starting to mandate this, just as they now do for diversity and inclusion, and environment, social and governance goals. Expect an announcement on nuclear fusion, a technology that could possibly ‘solve’ the global energy crisis.

Crunch times: The two biggest crunches faced by the tech world in 2021 were semiconductors, as global producers struggled with covid-disrupted supply chains and an explosion in demand as the pandemic eased, and an acute shortage of tech workers, as people discovered new ways to work. While the semiconductor crunch will ease, the people crunch will not. Technology is booming, with Big Tech growing rapidly, startups mushrooming and traditional companies going digital. The supply of tech workers cannot keep up, and the astronomical salaries they command show no sign of flagging.


The future of work is here: The pandemic-enforced work-from-home arrangements, continued rise of the gig economy and the emergence of the ‘passion economy’ has ensured that the future that we envisioned for work—work from anywhere, multiple employers, work-life integration and the redundancy of geography—has accelerated into the present. This has led to the Great Resignation and hybrid-work patterns, among other massive disturbances. Expect this to continue in 2022.

As I have written earlier, the covid outbreak has forced us to decentralize more or less everything, be it work, retailing (e-commerce), food (delivery), health (telemedicine) or education (study from home). This Great Decentralization has set a trend that I believe will be irreversible, and this is what is driving up the massive demand for technology and digital transformation, as traditional firms struggle to adapt.

It was another wise person who said, “Any believable prediction of the future will be wrong. Any correct prediction of the future will be unbelievable." Which one of the two these are, we will have to wait till year-end to find out. Let 2022 be a good one.

Jaspreet Bindra is the chief tech whisperer at Findability Sciences, and is studying AI, Ethics and Society at Cambridge University.

Source: Mint epaper, 20/01/22


Wednesday, November 03, 2021

Top 10 IT Issues, 2022: The Higher Education We Deserve

 The EDUCAUSE 2022 Top 10 IT Issues take an optimistic view of how technology can help make the higher education we deserve—through a shared transformational vision and strategy for the institution, a recognition of the need to place students' success at the center, and a sustainable business model that has redefined "the campus."

"There will never be a return to what we knew as normal," a university president stated during one of this year's IT Issues leadership interviews. Here, as we begin another year of the COVID-19 pandemic, we all recognize that the higher education we knew will not return. The past two years have served as an inflection point at which the much-discussed and much-debated transformation of higher education has accelerated and proliferated.

Another leader, a chancellor, said: "The best opportunity is to redefine education right now. What does higher education look like in a post-COVID world?" The leaders we interviewed are not reflexively reacting to the changes in the world and simply watching their institutions adapt in response. Instead, they are redefining the value proposition of higher education by reshaping institutional business models and culture to anticipate and serve the current and emerging needs of learners, communities, and employers. Rather than working to restore the higher education we had, they are creating the higher education we deserve.

What is the higher education we deserve? One leader emphasized transformed teaching and learning: "I believe that we have the opportunity to reconceptualize how it is that we are no longer going to be in front of the classroom but, instead, we're going to be facilitators of knowledge."

Another leader described a more "customer"-focused institution: "Universities are going to have to become increasingly commercially-minded and agile and adjust much more to what students want and to what employers and governments are asking from higher education as well. The successful institutions will be the learning institutions that are able to respond more dynamically and be more agile in terms of their response, compared with those universities that are less reflective, less able to change."

Another president emphasized the need for colleges and universities to differentiate themselves. "One of the criticisms of higher education is that it is excessively homogenous. There is substantially less choice for people who want to engage with higher education than you might expect. We need to start carving out areas of very distinctive expertise and advantage and then plug those, in a modular way, into much bigger programs of work. I think the biggest transformation will be the move away from the cookie-cutter institutions that attempt to be all things to all people toward players who really carve out and dominate more spaces. And I think that's going to be a tricky journey."

Each leader defined the new higher education a bit differently, but all recognized that the higher education we deserve cannot be created without technology. In fact, for the first time ever, most leaders spoke of technology not as a separate set of issues but as a driver and enabler of, and occasional risk to, their strategic agenda.

The 2022 Top 10 IT Issues describe the way technology is helping to make the higher education we deserve. Making the higher education we deserve begins with developing a shared transformational vision and strategy that guides the digital transformation (Dx) work of the institution. The ultimate aim is an institution with a technology-enabled, sustainable business model that has redefined "the campus," operates efficiently, and anticipates and addresses major new risks. Successfully moving along the path from vision to sustainability involves recognizing that no institution can be successful and sustainable without placing students' success at the center, which includes understanding how and why to equitably incorporate technology into learning and the student experience.

2022 Top 10 IT Issues

  • #1. Cyber Everywhere! Are We Prepared?: Developing processes and controls, institutional infrastructure, and institutional workforce skills to protect and secure data and supply-chain integrity
  • #2. Evolve or Become Extinct: Accelerating digital transformation to improve operational efficiency, agility, and institutional workforce development
  • #3. Digital Faculty for a Digital Future: Ensuring faculty have the digital fluency to provide creative, equitable, and innovative engagement for students
  • #4. Learning from COVID-19 to Build a Better Future: Using digitization and digital transformation to produce technology systems that are more student-centric and equity-minded
  • #5. The Digital versus Brick-and-Mortar Balancing Game: Creating a blended campus to provide digital and physical work and learning spaces
  • #6. From Digital Scarcity to Digital Abundance: Achieving full, equitable digital access for students by investing in connectivity, tools, and skills
  • #7. The Shrinking World of Higher Education or an Expanded Opportunity? Developing a technology-enhanced post-pandemic institutional vision and value proposition
  • #8. Weathering the Shift to the Cloud: Creating a cloud and SaaS strategy that reduces costs and maintains control
  • #9. Can We Learn from a Crisis? Creating an actionable disaster-preparation plan to capitalize on pandemic-related cultural change and investments
  • #10. Radical Creativity: Helping students prepare for the future by giving them tools and learning spaces that foster creative practices and collaborations

 Source: 2021–2022 EDUCAUSE IT Issues Panel, Susan Grajek

 


Monday, November 1, 2021

How we reached this online communication minefield

One of the earliest judgements on whether there was such a thing as privacy in private correspondence involved two of the greatest literary giants of their time, on one hand, and an early inventor of the trashy novel on the other. The case was the final denouement in a long-standing feud that writers Alexander Pope and Jonathan Swift had with publisher Edmund Curll. There isn’t enough space in this column for all the gory details and events that led to the final showdown in court. Suffice it to say that after a series of increasingly vicious attacks on each other, Edmund Curll got his hands on over 20 years of private correspondence between the two famed writers and published it for all to read.

Never before had a court been called upon to decide on the privacy implications of a new technology. Then again, never before had a technology made such radical improvements on the existing state of communications. Thanks to printing technology, what previously took months to manually transcribe now rolled off presses in a matter of hours. As much as this resulted in the widespread dissemination of information, it also made it possible for unscrupulous persons like Edmund Curll to print hundreds of copies of salacious gossip and place them in the hands of readers with little effort.

Technology constantly improves the way in which ideas are communicated—the speed with which they are created, the distances they travel and the audiences they reach. As much as each of these advances has improved the overall quality of knowledge in society, every iteration has resulted in progressively greater incursions into our personal space.

The postal system allowed messages to be sent further afield than was previously possible. But even though this allowed people separated by great distances to stay in touch, it increased the likelihood that what they said to one another would fall into the hands of strangers along the way. So serious was this concern that most countries criminalized the act of opening letters entrusted to the postal department by anyone other than its intended recipient.

The telegraph, the next improvement on communication technology, placed even greater stress on privacy. In order to send messages over the wires, telegraph companies had to employ operators to transcribe messages from Morse Code to English. As a result, even though the telegraph ensured that messages reached their intended recipients faster, the technology introduced novel constraints on what could be said, given that the very operation of the system required it to be read several times along the way.

Next came telephones, a technology that made it possible for individuals to speak directly with each other over long distances. In the very early days, entire neighbourhoods had to be connected using a single ‘party line’ that was used simultaneously by a number of families. While your telephone only rang when you were getting a call, it was entirely possible for you to pick up the phone and listen in on someone else’s conversation on that line. Even after individual homes were directly linked with exclusive telephone lines, calls still had to be put through by switchboard operators who could (and did) regularly listen in.

Each time a new technology is introduced to society, the novel features it has to offer are welcomed with enthusiasm. Thanks to this initial euphoria, it takes time for its effects on personal privacy to be felt. But every technology inevitably faces a societal backlash, usually led by the upper sections of society, the people who often have the most to lose if their privacy is infringed. Then, with the passage of some more time, society typically learns to adapt by adjusting the manner in which people communicate to account for the constraints imposed by the new technology.

We are currently in the midst of the latest evolution in communication technology. The mobile internet has upended the way we interact, and, for most of us, the initial euphoria has begun to wear thin. Since the internet never forgets, tools like news-feeds, search and algorithmic amplification surface information that most of us would rather had remained buried. Things said over a decade ago in an entirely different context can cause all sorts of embarrassment if dredged up today.

In a recent article, writer Byrne Hobart pointed out that privacy in online communication can never be absolute. The reason we find it hard to safeguard our privacy, Hobart argues, is that “the whole point of communicating is to violate your own privacy in a controlled way".

No matter how carefully we think about what we are posting online before we hit ‘send’, since we are susceptible to the very human failing of statistical bias, chances are that sooner or later, our assessment will turn out to be wrong. Which means that we need to view the very act of engaging in online communication as a risk management exercise that requires us to balance the benefit we hope to gain against the risks we could be exposed to as a result of it.

This realization has already altered the way that many of us communicate, forcing us to be more circumspect about how we engage in conversations online, mindful of the harms that could befall us if we are careless. The vast majority, though, still appear to get caught unawares when an innocuous or offhand remark sparks an uncontrollable conflagration of public response.

Rahul Matthan is a partner at Trilegal and also has a podcast by the name Ex Machina.

Source: Mint epaper, 2/11/21