DIGITALIZATION TRANSFORMS IT

Digitalization has been a transformative force in the field of Information Technology (IT) over the past few decades. It has revolutionized the way businesses operate, how people interact, and how information is accessed and processed.

This metamorphosis is evident in the widespread adoption of technologies that automate processes, enhance efficiency, and improve overall business operations. The integration of cloud computing has revolutionized IT infrastructure, providing organizations with scalability and flexibility. Data analytics and business intelligence have become integral, empowering decision-makers with actionable insights derived from vast datasets.

IT systems have evolved from traditional, localized setups to interconnected and cloud-based ecosystems that enable seamless data sharing, real-time collaboration, and scalable infrastructure. The convergence of artificial intelligence, big data analytics, and the Internet of Things has led to unprecedented insights, efficiencies, and innovation. However, this rapid digital transformation also brings new challenges, such as heightened cybersecurity risks and the need for continuous skill development. In this dynamic landscape, organizations must embrace agility, adaptability, and a forward-thinking mindset to harness the full potential of digitalization and stay competitive.

Organizations now have the ability to gather and analyze vast amounts of data to gain insights, enhance customer experiences, and streamline processes. This not only enhances efficiency and scalability but also empowers businesses to harness data analytics, artificial intelligence, and machine learning for more informed decision-making. Digitalization has thus revolutionized IT, offering opportunities for efficiency, growth, and innovation while demanding a proactive approach to navigate its complexities.

Furthermore, digitalization facilitates a more agile and responsive IT environment, enabling rapid adaptation to changing market dynamics and customer preferences. It also introduces new challenges, particularly around cybersecurity and data privacy, necessitating a comprehensive approach to risk management. In essence, the ongoing process of digitalization enables organizations to unlock unprecedented opportunities while navigating the complexities of a digitized world.

The rise of the Internet of Things (IoT) has ushered in an era of connected devices, enabling real-time data collection and smart environments. Customer experiences have been redefined through personalization and the proliferation of mobile technologies. Agile development practices and DevOps methodologies have accelerated software deployment, fostering a culture of continuous improvement. In essence, digitalization has not only modernized IT processes but has also instilled a culture of innovation, positioning businesses to thrive in the rapidly changing technological landscape.

ETHICAL CONSIDERATIONS IN ARTIFICIAL INTELLIGENCE

The rapid advancement of artificial intelligence (AI) has presented numerous benefits to society, from improved healthcare and transportation to enhanced communication and entertainment. However, as AI continues to be integrated into various aspects of our lives, it is crucial to consider the ethical implications of its development and deployment.

One of the most significant ethical considerations in AI is the potential for biases to emerge in AI algorithms. Biases can arise in AI models due to biased training data, which can have harmful implications for fairness and equity. For example, if an AI algorithm is trained on data that is biased against a particular group, such as people of colour or women, it may perpetuate that bias in its decision-making. This could result in unfair treatment and discrimination.

To address this issue, it is essential to ensure that AI training data is diverse and representative of all populations. Additionally, AI developers must continually monitor and test their algorithms for potential biases and take corrective action if necessary. This requires a commitment to transparency and accountability in AI decision-making processes, which can be challenging to achieve.
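One way to operationalize that monitoring is to track outcome rates across demographic groups. Below is a minimal Python sketch: it assumes a pandas DataFrame of model decisions with a hypothetical sensitive-attribute column and computes the demographic parity gap, one common fairness metric among several. The column names and the 0.2 threshold are illustrative assumptions, not a standard.

```python
import pandas as pd

# Hypothetical model decisions with a sensitive attribute; the column
# names and values are illustrative, not from any specific system.
df = pd.DataFrame({
    "group":    ["A", "A", "B", "B", "B", "A"],
    "approved": [1,    0,   0,   0,   1,   1],
})

# Approval rate per group: large gaps can indicate disparate impact.
rates = df.groupby("group")["approved"].mean()
parity_gap = rates.max() - rates.min()
print(rates)
print(f"Demographic parity gap: {parity_gap:.2f}")

# A simple monitoring rule: flag the model for review if the gap
# exceeds a chosen threshold (the threshold here is an assumption).
if parity_gap > 0.2:
    print("Potential bias detected - investigate training data.")
```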

Another critical ethical consideration in AI is the need for explainability in complex AI models. As AI becomes more sophisticated and complex, it can be difficult to understand how it arrives at its decisions. This lack of transparency can be problematic, particularly in high-risk applications such as healthcare or criminal justice. For example, if an AI algorithm is used to determine a person’s eligibility for a loan or job, it is essential to understand how the algorithm arrived at that decision to ensure that it is fair and unbiased.

To address this issue, researchers are exploring ways to develop explainable AI (XAI) that can provide insight into how AI models arrive at their decisions. XAI can help ensure that AI is transparent, accountable, and fair, which is crucial for building trust in AI systems.
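For a flavor of what such tooling can look like in practice, here is a minimal sketch using scikit-learn's model-agnostic permutation importance, one simple explainability technique among many. The synthetic dataset is a stand-in; a real audit would run this against the production model and data.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance

# Synthetic stand-in for, say, a loan-decision dataset.
X, y = make_classification(n_samples=500, n_features=5, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X, y)

# Permutation importance: how much does shuffling each feature hurt
# the model's score? Larger drops mean the model leans on that feature.
result = permutation_importance(model, X, y, n_repeats=10, random_state=0)
for i, score in enumerate(result.importances_mean):
    print(f"feature_{i}: {score:.3f}")
```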

AI also raises concerns about the impact on jobs. While AI has the potential to create new jobs and enhance productivity, it may also result in job displacement as machines replace human labor. This could have significant implications for the workforce and the economy.

To address this issue, policymakers and organizations must invest in reskilling and upskilling programs to help workers transition to new roles. Additionally, it is essential to ensure that AI is developed and deployed in a way that benefits society as a whole, rather than just a select few.

Finally, there are concerns about the use of AI for surveillance and the potential for infringing on individuals’ privacy rights. As AI becomes more integrated into our daily lives, it could be used to monitor our behavior, track our movements, and collect data about us without our knowledge or consent.

To address this issue, policymakers must establish clear guidelines and regulations around the use of AI for surveillance. Additionally, organizations must prioritize data privacy and implement robust security measures to protect individuals’ information.

In conclusion, AI presents numerous benefits and opportunities for society, but it is essential to consider the ethical implications of its development and deployment. To ensure that AI is used responsibly and for the benefit of all, it is crucial to prioritize transparency, accountability, and fairness in AI decision-making processes. This requires a collaborative effort between tech companies, policymakers, and society as a whole to establish ethical frameworks and guidelines for AI development and deployment.

THE RISE OF EDGE COMPUTING

In today’s fast-paced and interconnected world, the demand for real-time data processing has never been greater. With the continuous development of IoT devices, autonomous vehicles, and other connected technologies, the need for instant data analysis and decision-making has become a critical aspect of many industries. This is where edge computing comes into play.

Edge computing, also known as edge analytics or fog computing, is a decentralized computing infrastructure that brings data processing closer to the source of generation, rather than relying on centralized cloud servers. In simple terms, it involves moving the processing power closer to the edge of the network, closer to where the data is being generated.

The rise of edge computing can be attributed to several factors. One of the main driving forces behind its adoption is the need to address the latency issues faced by applications that require real-time data processing. With traditional cloud computing, data is sent to centralized servers, which can introduce delays due to the distance the data needs to travel. This latency can be a significant problem for applications that require immediate responses, such as autonomous vehicles that need to make split-second decisions based on sensor data.

Edge computing solves this problem by processing the data closer to where it is being generated, reducing the latency. By doing so, it enables real-time analytics and decision-making, allowing applications to respond quickly and efficiently. This is especially crucial in scenarios where even a slight delay can have severe consequences, such as autonomous vehicles or critical industrial processes.

Another advantage of edge computing is its ability to handle massive amounts of data at the edge of the network. With the proliferation of IoT devices, there has been a significant increase in the volume of data being generated. Sending all of this data to centralized cloud servers for processing and analysis can be impractical and expensive. Edge computing offers a more cost-effective solution by processing the data locally, only sending relevant information to the cloud for further analysis or storage. This reduces bandwidth consumption and allows organizations to optimize their infrastructure.
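To make the bandwidth argument concrete, here is a minimal Python sketch of this filter-and-forward pattern; the sensor reading, alert threshold, and send_to_cloud function are all illustrative placeholders, not a real device API.

```python
import random
import statistics

THRESHOLD = 75.0  # alert threshold; an assumed value for illustration

def read_sensor():
    """Stand-in for an IoT temperature sensor reading."""
    return random.gauss(70.0, 5.0)

# Process a window of raw readings locally at the edge...
window = [read_sensor() for _ in range(100)]
summary = {
    "mean": statistics.mean(window),
    "max": max(window),
    "alerts": sum(1 for r in window if r > THRESHOLD),
}

# ...and send only the small summary upstream, not all 100 raw readings.
def send_to_cloud(payload):  # placeholder for a real network call
    print("uploading:", payload)

send_to_cloud(summary)
```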

Furthermore, edge computing enhances data privacy and security. By processing data locally, sensitive information can be kept within the confines of a specific location or device, reducing the risk of data breaches or unauthorized access. This becomes increasingly important as data privacy regulations become stricter and the value of personal and sensitive information grows.

The adoption of edge computing has already made a significant impact in various industries. In the healthcare sector, for example, edge devices like wearable fitness trackers can analyze health data in real time; in industrial settings, edge processing enables predictive maintenance, optimizing machine performance and reducing downtime.

In conclusion, the rise of edge computing signifies a paradigm shift in how data is processed and analyzed in the tech industry. By bringing the processing power closer to the edge of the network, edge computing addresses the latency issues faced by applications that require real-time data processing. It enables faster response times, reduces bandwidth consumption, enhances data privacy and security, and opens up new possibilities for innovation in various industries. As the demand for real-time data processing continues to grow, edge computing is poised to play an increasingly vital role in shaping the future of technology.

Use of Information Technology to Control Machinery

In accounting and business, the use of IT to control operations involves specific activities, performed by systems or trained individuals, designed to ensure that the business's objectives and goals are achieved. These activities form part of the enterprise's internal control. Information technology controls are concerned with the integrity, availability, and confidentiality of data, and with the overall management of the enterprise's IT function.

These controls are typically categorized into two groups: IT application controls and IT general controls, commonly referred to as ITGC. ITGC covers controls over computer operations, program changes, program development, the IT environment, and access to programs and data. Let's learn more about these controls below.

What Do IT Controls Refer To?

IT application controls usually refer to transaction processing controls, sometimes called input-processing-output controls. IT controls have gained increased prominence in corporations listed in the United States.

The COBIT framework, or Control Objectives for Information and Related Technology, is a widely used framework published by the IT Governance Institute. It defines application control and ITGC objectives and recommends an evaluation approach. An organization's IT department is usually led by a Chief Information Officer (CIO), who is responsible for ensuring that effective IT controls are in place.

Information Technology General Controls (ITGC)

IT general controls form the foundation of the IT control structure. They help ensure the reliability of data generated by IT systems and support the assertion that the systems operate as intended and that their output is reliable.

Typically, ITGC includes the following types of controls:

  • Document Version Control/Source Code Procedures: controls designed to protect the integrity of program code.
  • Control Environment: controls designed to shape the corporation's culture, or "tone at the top."
  • Logical Access Standards, Processes, and Policies: controls designed to manage access based on business needs.
  • Software Development Life Cycle Standards: controls designed to ensure that IT projects are effectively managed.
  • Change Management Procedures: controls designed to ensure that changes meet business requirements and are authorized.
  • Physical Security: controls that ensure the physical security of information technology from environmental risks as well as from individuals.
  • Technical Support Policies and Procedures: policies that help users perform accurately and report problems.
  • Backup and Disaster Recovery: procedures that enable processing to continue despite adverse conditions.
  • Problem Management Policies and Procedures: controls designed to identify and address the root causes of problems.
  • Incident Management Policies and Procedures: controls designed to address operational processing errors.
  • Software/Hardware testing, configuration, installation, management standards, policies, and procedures.

Information Technology Application Controls

IT application controls, also called program controls, are fully automated and designed to ensure the complete and accurate processing of data from input through output. The controls vary depending on the business purpose of the particular application. They can also help ensure the security and privacy of data transmitted between applications.

Categories of Information Technology Application Controls may encompass:

  • Identification: a control that ensures all users are irrefutably and uniquely identified.
  • Completeness Check: a control that ensures all records are processed from initiation to completion (see the sketch after this list).
  • Authentication: a control that provides an authentication mechanism in the application system.
  • Input Controls: a control that ensures the integrity of data fed into the application system from upstream sources.
  • Validity Checks: a control that ensures only valid data is input or processed.
  • Forensic Controls: a control that ensures data is scientifically and mathematically correct based on inputs and outputs.
  • Authorization: a control that ensures only approved business users have access to the application system.
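To show what automated completeness and validity checks can look like, here is a minimal Python sketch; the required fields, currency list, and record format are illustrative assumptions, not drawn from any specific application.

```python
REQUIRED_FIELDS = {"user_id", "amount", "currency"}
VALID_CURRENCIES = {"USD", "EUR", "GBP"}

def validate_record(record: dict) -> list:
    """Return a list of control violations for one input record."""
    errors = []
    # Completeness check: every required field must be present.
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
    # Validity checks: only well-formed data may be processed.
    if "amount" in record and not (
        isinstance(record["amount"], (int, float)) and record["amount"] > 0
    ):
        errors.append("amount must be a positive number")
    if "currency" in record and record["currency"] not in VALID_CURRENCIES:
        errors.append(f"unknown currency: {record['currency']}")
    return errors

# Example: one record passes, one is rejected before processing.
for rec in [{"user_id": 7, "amount": 120.0, "currency": "USD"},
            {"user_id": 8, "amount": -5}]:
    problems = validate_record(rec)
    print("OK" if not problems else f"REJECTED: {problems}")
```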

The Internal Control Frameworks

In this section of the article, we cover the basics of COBIT and COSO, which are both internal control frameworks. We will start with COBIT:

Control Objectives for Information and Related Technology (COBIT)

COBIT is a widely used framework containing best practices for both application controls and ITGC. It consists of domains and processes: the basic structure maps each IT process to the business requirement it satisfies and to the IT control activities that enable it. COBIT also recommends best practices for evaluating an enterprise's IT controls.

Committee of Sponsoring Organizations of the Treadway Commission (COSO)

COSO identifies five components of internal control: control environment, risk assessment, control activities, information and communication, and monitoring. All of these components should be in place to achieve financial reporting and disclosure objectives. COBIT provides similarly detailed guidance for IT, while the related Val IT framework concentrates on higher-level IT governance and value-for-money issues.

COSO's five components can be visualized as the horizontal layers of a three-dimensional cube, with the COBIT objective domains applying to each individually and in aggregate. COBIT's four major domains are Plan and Organize, Acquire and Implement, Deliver and Support, and Monitor and Evaluate.

Conclusion:

Beyond the facts and descriptions given above, there are many benefits to using information technology to control machinery. And since technology pervades nearly every aspect of our world today, controlling machines with information technology will never get old. With today's technology, the previous issues of slow response times, complex systems, and overwhelming data storage have become relics of the past.

The current technology is capable of transforming and enhancing the internal operations of organizations.

What is the Information Technology Audit?

An information system audit, or IT audit, is an examination of the management controls within an organization's information technology infrastructure. Assessment of the evidence obtained determines whether the information systems are operating effectively, maintaining data integrity, and safeguarding assets so as to achieve the company's goals and objectives.

These reviews may be performed in combination with an internal audit, a financial statement audit, or another form of attestation engagement.

What is the Purpose of IT Audits?

As information technology is increasingly used by governments, consumers, and businesses, its prominence has grown alongside our dependence on these systems. At a macro level, international trade, national commerce, and government operations have come to depend more and more on technology. Over the past years, federal, state, and local government agencies have spent more than $100 million on IT systems alone; adding in the IT expenditures of commercial enterprises, the total easily reaches about $1 billion.

Given that most organizations rely on IT processes and systems, ensuring confidence in those systems and trust in their output has become crucial. The best way to establish that the systems are reliable is to measure their impact, inspect the systems, and report on the findings. This is the purpose of IT audits, and their role will continue to grow in all companies and organizations as the need for privacy, security, and confidentiality increases.

Recognizing the growing importance of technology, the federal government, as well as most states, has created the position of Chief Information Officer (CIO), specifically in charge of carrying out the IT strategy of the company or organization. A critical aspect of this strategy is developing standards and requirements for the creation and use of IT systems, which serve as the guidelines for IT audits.

How Are IT Audits Performed?

IT audits are usually performed by specially trained or certified personnel. Those who perform the reviews, known as auditors, may be external personnel who offer audits as a service, or internal staff asked by the company to do the auditing. As noted above, an IT audit can be part of an all-encompassing, organization-wide audit or cover just the IT systems.

Furthermore, an IT audit can also be broken down into smaller evaluations in which only specific operations or systems within the IT organization are inspected. Regardless of the scope of the audit or the responsible party, all IT audits adhere to well-documented, stringent processes and procedures to ensure comprehensive coverage.

Using checklists and guidelines as aids, auditors evaluate the controls, processes, technology, and personnel within the audit scope. During the review, auditors evaluate compliance with government regulations and/or organizational policies and identify the risks arising from non-compliance. They also assess inefficiencies in procedures, IT systems, and processes, and recommend steps to minimize risks and correct sub-par performance.

What are the Types of IT Audits?

As an information system auditor, you'll perform many different types of audits, but the first thing you need is a general understanding of auditing. Below are the three types of audits.

  • Internal Audit

An internal audit focuses on evaluating an organization's risk management processes, governance processes, and control environment. As an internal IT auditor, you are part of the organization; however, you should report to the highest level of management, as this ensures the objectivity and independence of the audit function.

  • External Audit

Unlike internal auditors, external auditors sit completely outside the management structure, but their function is the same: evaluating the control structure, governance processes, and risk management. These auditors enjoy full independence, as they do not report to management; external audits are also often mandated by law.

  • Third Party Audit

A third-party audit is often jointly contracted by two or more parties to ensure that common agreements or functions are being adhered to and that they function as intended.

5 Types of Information System Audits

As an information system auditor, you'll conduct various types of audits. You might work as a company employee performing internal audits, as an external auditor, or as a contracted third-party auditor.

Since the field of information systems is so vast, your work will mostly fall into the following sub-types of information system audits:

  1. General Controls Audit

Your work will mainly involve reviewing the generally accepted controls across all implementations of the information system. This can cover systems operation, application security, systems development, and systems maintenance.

  2. Application Controls Audit

This type of audit focuses on a specific application. It might involve evaluating the input, processing, and output controls of a specific piece of software.

  3. Systems Development Audit

This type of audit focuses on software or systems development. You'll mostly audit the system development process, from requirements gathering through to the final product running in production.

  4. Forensic Audit

You may be asked to audit a specific system after suspicious or unusual activity has been observed and reported.

  5. Integrated Audit

This kind of audit involves working with other teams of auditors, such as performance or financial auditors.

Conclusion:

Working as an information technology auditor is a multi-dimensional and very challenging job. However, it could also be one of the best choices you make as a computer science or IT major, as it's a job with a good salary and a great working environment.

Cloud Computing

Cloud computing is a paradigm-shifting technology that has revolutionized the way businesses and individuals manage and utilize computing resources. At its core, cloud computing refers to the delivery of various services – including storage, processing power, and software – over the Internet. Instead of relying solely on local hardware and software, users can tap into a network of remote servers hosted in data centres around the world.

Why Choose Cloud Computing?

Seamless Scalability: With our cloud-based services, you can effortlessly scale your resources up or down to meet fluctuating demands, ensuring optimal performance during peak times while reducing costs during lulls.

Enhanced Security: Protect your sensitive data and confidential information with state-of-the-art security measures, stringent access controls, and encrypted data storage offered by our advanced cloud infrastructure.

Unmatched Flexibility: Our cloud solutions empower your team to work seamlessly from anywhere, promoting remote collaboration and productivity in the modern work environment.

Cost Efficiency: Bid farewell to costly on-premise hardware and maintenance expenses, as cloud computing eliminates the need for physical infrastructure and offers a cost-effective pay-as-you-go model.

Unrivalled Performance: Experience lightning-fast data processing and application performance, thanks to our robust cloud architecture equipped with the latest technologies.

Unleash the true potential of your business with the power of cloud computing. Partner with us today and experience a seamless transformation into the future of digital innovation.

One of the key benefits of cloud computing is its scalability. Traditional IT infrastructure often requires significant upfront investments to accommodate potential growth. In contrast, cloud services allow users to scale their resources up or down based on demand, making it a cost-effective solution for businesses with varying computing needs. This elasticity also promotes agility, enabling organizations to quickly adapt to market changes and deploy new applications without the delays associated with procuring physical hardware.

Security and data privacy are critical concerns in cloud computing. Cloud service providers implement robust security measures to safeguard data, often employing encryption and multi-factor authentication. However, since data is stored offsite, some individuals and organizations remain cautious about potential breaches. It’s essential for users to select reputable providers and implement additional security measures to mitigate risks.
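One such additional measure is encrypting sensitive data on the client side before it ever reaches the provider. Below is a minimal sketch using the Python cryptography library's Fernet API; the upload function is a hypothetical placeholder, and a real deployment would keep the key in a dedicated key-management service.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# Generate and keep a key locally; in practice it would live in a
# key-management service, never alongside the uploaded data.
key = Fernet.generate_key()
cipher = Fernet(key)

document = b"quarterly financials - confidential"
encrypted = cipher.encrypt(document)

# Only ciphertext ever leaves the premises (upload_to_cloud is a
# placeholder, not a real provider API).
def upload_to_cloud(blob: bytes):
    print(f"uploading {len(blob)} encrypted bytes")

upload_to_cloud(encrypted)

# The data is readable again only by someone holding the key.
assert cipher.decrypt(encrypted) == document
```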

Different models of cloud computing exist: Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS). IaaS provides virtualized computing resources over the internet, allowing users to manage their own operating systems and applications. PaaS offers a platform and environment for developers to build, deploy, and manage applications without worrying about underlying infrastructure. SaaS delivers software applications directly to users over the internet, eliminating the need for local installations.

Cloud computing has democratized technology access, enabling startups and smaller businesses to access enterprise-level resources without exorbitant costs. Collaboration and remote work have also been greatly facilitated, as cloud-based tools allow teams to work together on documents and projects from different locations.

In conclusion, cloud computing has transformed the way technology is consumed and has become an integral part of modern business operations. Its flexibility, scalability, and potential for innovation have positioned it as a fundamental driver of the digital age, with ongoing advancements continuing to shape its evolution and impact.

Big Data

Big data has emerged as a transformative force in the modern digital landscape, reshaping the way businesses, researchers, and individuals interact with information. It refers to massive, complex sets of data that exceed the processing capacity of traditional database systems. These datasets are characterized by their volume, velocity, variety, and veracity, collectively known as the four V's. With the advent of advanced technologies such as the Internet of Things (IoT), social media, and sensor networks, the accumulation of data has surged exponentially.

Big data holds immense potential, as it allows organizations to derive valuable insights, make informed decisions, and uncover patterns that were previously inaccessible. However, the challenges of managing, analyzing, and securing such colossal datasets are not to be underestimated. As the world continues to generate data at an unprecedented rate, effectively harnessing big data has become a key driver of innovation and competitiveness across industries, paving the way for smarter strategies and a deeper understanding of the intricate workings of our interconnected world.

With its foundation in the accumulation of vast and diverse datasets, big data transcends conventional data processing capabilities, presenting an array of opportunities and challenges. The exponential growth in data volume, propelled by sources like social media, IoT devices, and scientific experiments, underscores the magnitude of this phenomenon. Its multifaceted nature, encompassing structured and unstructured data in the form of text, images, videos, and more, amplifies its relevance across industries.

The dynamism of real-time processing enables timely decision-making, particularly in sectors reliant on instantaneous responses. Yet, big data’s true value lies not only in its volume but also in the potential for predictive analytics, where historical data shapes forecasts that guide future actions. However, amidst these virtues, the specter of data quality and security looms large. Ensuring the veracity of information and safeguarding sensitive data remain persistent concerns. Big data reverberates through healthcare, commerce, research, and governance, igniting innovations like personalized medicine, tailored customer experiences, scientific breakthroughs, and smart city initiatives. As our capacity to harness, process, and derive meaning from big data evolves, so too does its power to reshape the landscape of human understanding, decision-making, and progress.

Here are three key advantages of big data:

Data-Driven Decision Making: Big data analytics enables organizations to make informed decisions based on evidence rather than intuition. By analyzing large and diverse datasets, businesses can identify trends, patterns, and correlations that provide valuable insights into customer behavior, market trends, and operational efficiency. This leads to more accurate forecasting, targeted marketing campaigns, and optimized resource allocation.
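As a small illustration of this evidence-based approach, the following Python sketch aggregates hypothetical sales data with pandas to compare regions and smooth out daily noise; the dataset, column names, and numbers are invented for the example.

```python
import pandas as pd

# Illustrative daily sales data; in practice this would come from a
# data warehouse or transaction log.
sales = pd.DataFrame({
    "date": pd.date_range("2024-01-01", periods=90, freq="D"),
    "region": ["North", "South", "East"] * 30,
    "revenue": [round(1000 + i * 3.5, 2) for i in range(90)],
})

# Revenue per region: a simple evidence base for allocation decisions.
by_region = sales.groupby("region")["revenue"].agg(["mean", "sum"])
print(by_region.sort_values("sum", ascending=False))

# A 7-day rolling average smooths noise and exposes the underlying trend.
daily = sales.groupby("date")["revenue"].sum()
print(daily.rolling(7).mean().tail())
```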

Innovation and Research: Big data has revolutionized research and innovation across various fields. In healthcare, for instance, analyzing large datasets can lead to breakthroughs in disease detection, drug development, and personalized treatment plans. In scientific research, big data allows scientists to model complex systems and simulate scenarios that were previously unfeasible, accelerating advancements in fields like climate science, genomics, and particle physics.

Enhanced Customer Experience: Big data enables businesses to gain a deeper understanding of their customers and tailor their products and services to individual preferences. By analyzing customer interactions, purchase histories, and feedback, companies can create personalized experiences that resonate with their target audience. This leads to increased customer satisfaction, loyalty, and ultimately, higher retention rates.

Here are two additional examples of how big data has contributed to the growth of companies:

Airbnb: Personalized Recommendations and Pricing Optimization: Airbnb, the global hospitality marketplace, relies heavily on big data to enhance its user experience and drive growth. The company employs sophisticated algorithms to analyse user behaviour, preferences, and historical booking data. This enables Airbnb to provide personalized accommodation recommendations to users, increasing the likelihood of successful bookings.

Zara: Agile Supply Chain and Trend Forecasting: Zara, a fast-fashion retailer, has used big data to revolutionize its supply chain and stay ahead of fashion trends. The company gathers data from various sources, including social media, online searches, and sales data, to anticipate fashion trends and consumer preferences.

Basic Internet and Computer Terms Everyone Should Know

As a sort of follow-up to my last blog post about common IT problems and solutions, I thought it might be helpful to give a quick rundown of some basic internet and computer terms.

As an IT company, we often get calls about IT issues that are difficult for both us and the client. And it’s not because it’s a complicated issue – it’s because the client does not know how to properly describe their problem.

Stop Stressing: Easy Solutions to IT Problems

With the digitization and automation of today's business, knowing how to navigate technology is becoming a necessary skill. Perhaps an IT company such as ourselves shouldn't be giving out DIY solutions, but we are committed to making our customers' lives as easy as possible. And sometimes the answer isn't calling your IT team – it's just restarting your computer.

These technology tips will hopefully help the people out there who aren’t as tech-savvy, or simply don’t have the time to start searching for answers and end up slumped dejectedly at their computer. So here are some go-to ideas when your computer is acting up.

Navigating the Internet of Things

Forbes recently published an article about how the Internet of Things contributes to our safety. However, I found the article to be somewhat one-dimensional. For those who don’t know, the Internet of Things (IoT) is defined broadly as “the network of physical devices, vehicles, home appliances, and other items embedded with electronics, software, sensors, actuators, and connectivity which enables these things to connect, collect and exchange data” (Wikipedia).

The definition may be a mouthful, but in essence it signifies that IoT involves everyday objects connected to the internet. Generally, any device prefixed with “smart” is part of this category. We see this everywhere – smart fridges that can tell you when to restock, smart homes fully controlled from your smartphone, smart cars that can navigate and (soon) drive and park for you, and countless other things.