IT Concept Terminology II

11th Grade

13 Qs

Similar activities

OCR Computing GCSE Starter 1

10th - 11th Grade

10 Qs

Decimal and Hexadecimal Conversions

11th - 12th Grade

10 Qs

GCSE 03 Data Representation Pt1 (Number Bases)

10th - 11th Grade

11 Qs

Number System Conversions 1

9th - 12th Grade

10 Qs

Computer Basics - Principles of Info Tech (Ch 1)

9th - 12th Grade

17 Qs

Hexadecimal Conversions

11th Grade

10 Qs

GCSE Computing 1.1 - 1.3 Revision

10th - 11th Grade

17 Qs

CompTIA,4,5,6

11th Grade

12 Qs

IT Concept Terminology II

Assessment

Quiz

Computers

11th Grade

Medium

Created by Sandra Battle

Used 13+ times

13 questions

1.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Which one of the following notational systems, commonly used in the field of computing, relies on the use of only 1s and 0s?

Unicode

Hexadecimal

Binary

ASCII

Answer explanation

Binary is the foundational notational system in computing, representing data using only two symbols, 0 and 1. It is widely used for low-level operations and storage representation. Binary is essential for understanding how computers process and store information. Unicode is a character encoding standard that aims to represent every character from every writing system in the world. It supports a vast range of characters and is commonly used in modern computing for multilingual applications. Similar to ASCII, Unicode is primarily focused on character representation and not as a general-purpose notational system. ASCII (American Standard Code for Information Interchange) is a character encoding scheme that assigns unique numeric codes to represent characters. It is widely used in early computing systems and still has significance today. However, ASCII is primarily used for character representation and not as a general-purpose notational system. Hexadecimal is commonly used in computing as a more compact and human-friendly representation of binary. It uses a base-16 system, with digits ranging from 0 to 9 and letters A to F, to represent data. While hexadecimal is commonly used for various purposes, it is not the only notational system used in computing.
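
To make the contrast concrete, here is a minimal Python sketch (the value 65 and the language are arbitrary choices for illustration) showing the same number written in binary, in hexadecimal, and mapped to a character through its ASCII/Unicode code point.

```python
value = 65

print(bin(value))   # '0b1000001' -> binary uses only the digits 1 and 0
print(hex(value))   # '0x41'      -> hexadecimal packs 4 bits into each digit (0-9, A-F)
print(chr(value))   # 'A'         -> ASCII/Unicode maps the number to a character
print(ord("A"))     # 65          -> and maps the character back to its code
```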

2.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Which among the following units is commonly used to measure digital storage?

Gigabytes

Bits per Second (bps)

Hertz (Hz)

Pixel

Answer explanation

Gigabyte (GB) is a unit of digital information used to measure storage capacity in computing. One gigabyte holds approximately 1 billion bytes; in binary-based contexts the figure 1,073,741,824 bytes (2^30) is often used instead. Bits per second (bps) measures transmission speed and is typically used in network environments; it describes the rate of data transfer, not storage capacity. Pixel is a unit used in digital imaging to represent a single point in a raster image, not a unit of storage. Hertz is a measure of frequency (cycles per second) and is typically used to describe processing speed or the frequency of alternating current, not storage capacity.
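
As a rough illustration of the two conventions mentioned above, the short Python snippet below computes the decimal (SI) gigabyte alongside the binary figure of 1,073,741,824 bytes; the constant names are invented for this example.

```python
GB_DECIMAL = 1000 ** 3   # 1,000,000,000 bytes (SI gigabyte)
GIB_BINARY = 1024 ** 3   # 1,073,741,824 bytes (binary convention, 2**30)

print(f"{GB_DECIMAL:,} bytes vs {GIB_BINARY:,} bytes")
# Bits per second, pixels, and hertz measure transfer rate, image points,
# and frequency, respectively; none of them measure storage capacity.
```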

3.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Which data representation method is a character encoding standard that represents text and control characters in computers, communication equipment, and other devices, using 7 bits to define each character?

Hexadecimal

Binary

Unicode

ASCII Code

Answer explanation

ASCII (American Standard Code for Information Interchange) is a character encoding standard that represents text and control characters in computers, communication equipment, and other devices. It uses 7 bits to define each character, making it the correct choice for this question. Hexadecimal is a base-16 numbering system that uses digits from 0-9 and letters A-F to represent values. It is often employed in programming and digital communication, but it is not a character encoding standard. While Unicode is a character encoding standard that accommodates a wide set of characters and symbols from various languages, it is not the correct choice for this question because it does not use 7 bits to define each character; Unicode is much more extensive than ASCII but is not specifically 7-bit based. Binary is a base-2 numbering system that employs only two base digits (0 and 1) and represents the fundamental building blocks of digital computers. It is not a character encoding standard.
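
A small Python sketch of the 7-bit point, with an arbitrary sample string: every ASCII code fits in the range 0-127 (seven bits), while characters outside that range require Unicode.

```python
text = "Hi!"
for ch in text:
    code = ord(ch)
    print(ch, code, format(code, "07b"))   # character, decimal code, 7-bit binary
    assert code < 128                      # 2**7 = 128 possible ASCII codes

# A character outside ASCII needs Unicode and more than 7 bits:
print(ord("é"))   # 233 -> not representable in 7-bit ASCII
```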

4.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

In normal computer operation, a monitor primarily serves which of the following functions?

Output

Storage

Processing

Input

Answer explanation

A monitor is an output device. It displays data or information processed by the computer in a form that is usable and understandable by the user. The function of an input device is to enter data into a computer for processing; devices like a keyboard or a mouse typically serve this function, not a monitor. Storage is the function of saving and keeping data and instructions for future use. Storage devices include hard drives and solid-state drives, not monitors; the monitor only displays data, it doesn't store it. Processing is the operation that manipulates data as directed by a sequence of instructions. It is handled by the computer's CPU (Central Processing Unit), not the monitor.

5.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

Which of the following data types does NOT have the ability to represent multiple characters in a sequence?

Boolean

Numbers

Strings

Array

Answer explanation

Booleans cannot represent sequences of characters; their sole purpose is to represent true or false values. The main function of the String data type is to store sequences of characters, and it can hold either a single character or multiple characters. While the primary purpose of the Number data type is to store numerical values, a number such as 12345 can still be read as a sequence of digit characters. In most programming languages, an array can hold multiple values, including characters, in a sequence: an array composed of character elements effectively forms a sequence of characters.
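
For readers who want to see the distinction in code, here is an illustrative Python sketch using rough analogues of the data types named above; the variable names and values are invented for the example.

```python
flag = True                 # Boolean: only True or False, never a character sequence
name = "hello"              # String: a sequence of characters
digits = 12345              # Number: a single numeric value; its digits can be read
                            # as characters only after converting, e.g. str(digits)
letters = ["h", "e", "y"]   # Array/list: holds several characters in sequence

print(type(flag), type(name), type(digits), type(letters))
```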

6.

MULTIPLE SELECT QUESTION

45 sec • 1 pt

In normal computer operation, a keyboard primarily serves which of the following functions?

Input

Output

Storage

Processing

Answer explanation

A keyboard is primarily used to input data into the computer system. By pressing letters, numbers, and other special characters, users can send information to the system for processing. The function of processing primarily includes performing actions on the provided data or information. This function is taken care of by the central processing unit (CPU) and not by a keyboard. The function of storage involves retaining data for future use. While keyboards do send inputs to an application in memory (RAM) or storage (hard drive, SSD), they don't themselves serve a primary role as a storage device. Devices like Hard Disk Drives (HDD), Solid State Drives (SSD), and memory sticks are examples of storage devices. Output is incorrect because the keyboard does not display the results of the data processed. Output devices include monitors, printers, speakers, etc., which display or produce the results after the processing of the input data.

7.

MULTIPLE CHOICE QUESTION

30 sec • 1 pt

What is the equivalent speed of 1 Gigahertz (GHz) in Megahertz (MHz)?

1024 MHz

1000 MHz

500 MHz

1 MHz

Answer explanation

When converting frequencies, we use the decimal system (base 10): 1 gigahertz (GHz) equals 1000 megahertz (MHz). 500 MHz is only half of 1 GHz, and 1 MHz is one-thousandth of it. The answer 1024 MHz confuses frequency with digital storage, where the binary system (base 2) is often used and 1 gigabyte is treated as 1024 megabytes; frequency prefixes always follow base 10.
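
A minimal Python sketch of the conversion, using a hypothetical helper named ghz_to_mhz and contrasting it with the binary convention sometimes used for storage.

```python
def ghz_to_mhz(ghz):
    # Frequency prefixes are decimal (base 10): 1 GHz = 1000 MHz.
    return ghz * 1000

print(ghz_to_mhz(1))     # 1000
print(ghz_to_mhz(3.5))   # 3500.0

# Contrast: digital storage often uses the binary convention (base 2),
# where 1 gigabyte is treated as 1024 megabytes.
print(1 * 1024)          # 1024
```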
