    Cloud Computing - Tag - questions - EngineersHub
    Tag results found for "Cloud Computing" in Questions.
    Question
    Ruchitha
    6 months ago
1 Answer posted
    Answer posted by Akhil Kumar Lakum
    5 months ago

Internet development trends include:

(i) Internet of Things (IoT)

(ii) Cyber-Physical Systems (CPS)

(i) Internet of Things (IoT): The Internet refers to the interconnection of various devices that form a network, whereas the Internet of Things is a network that connects the various objects, devices, tools, etc. that are used in computing. These things are usually connected wirelessly via sensors, because they vary in size and are distributed across time and space. The most common technologies used to provide this kind of connectivity are RFID and GPS.

With the advent of IPv6, it is now possible to allocate 2^128 IP addresses, which can cover computers along with other devices such as mobile phones. Researchers associated with the IoT suggest that it must be capable of handling a trillion objects concurrently, irrespective of their types, because in the future each person is expected to depend on an average of 100 to 5,000 objects. Due to this, things need to be classified universally, which makes the system complex; this complexity can be decreased by employing a threshold value.
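The IPv6 address-space figure is easy to sanity-check; a quick illustrative calculation in Python (the trillion-object figure is the estimate quoted in the text):

```python
# IPv6 addresses are 128 bits wide, so the address space is 2**128.
ipv6_addresses = 2 ** 128
print(ipv6_addresses)  # 340282366920938463463374607431768211456

# Even a trillion connected objects would occupy a vanishingly
# small fraction of that space.
trillion_objects = 10 ** 12
print(f"{trillion_objects / ipv6_addresses:.3e}")
```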

The major aim of the IoT is to provide communication between various things and humans at low cost, irrespective of their location and time.

(ii) Cyber-Physical Systems (CPS): Systems that combine computational elements with the physical objects that exist in the real world are known as Cyber-Physical Systems (CPS). A CPS is typically considered a network that provides communication between physical objects and computational elements. The application areas of CPS include civil infrastructure, chemical research, transport, energy, entertainment and many more.

The concept of CPS is similar to the IoT, except that it makes VR (Virtual Reality) applications available for use with physical entities. A real-time example of a CPS is a robot whose movement is carried out with the help of various sensors along with navigation and wireless networking features.

    Question
    Naseem Shaik
    6 months ago
1 Answer posted
    Answer posted by Anvesh Kanchibhotla
    6 months ago

Degrees of Parallelism: The degree of parallelism (DOP) is a measure of the capability of a distributed system to run multiple programs or operations concurrently, i.e. in parallel. With improvements in computing technology, DOP evolved from bit-level parallelism to job-level parallelism.

Bit-Level Parallelism (BLP): This type of parallelism was typically used in early systems that were monolithic, expensive and processed data bit by bit. In these systems, bit-level parallelism was used to transform bit-level processing into word-level processing.

Instruction-Level Parallelism (ILP): As processors evolved from 4-bit to 64-bit words, ILP was employed, with which more than one instruction can be processed concurrently. Examples of ILP include multithreading, pipelining, superscalar computing, etc.

Data-Level Parallelism (DLP): DLP depends on hardware and compiler support in order to carry out its work efficiently. It was employed in SIMD processors.

Task-Level Parallelism (TLP): TLP came into existence with the development of CMPs (chip multiprocessors) and multicore processors. It is not preferred over other types of DOP because it is complex to code and compile.

Job-Level Parallelism (JLP): With the development of distributed computing, the granularity of processing increased, and DOP was accordingly extended to job-level parallelism (JLP).
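As a loose illustration (not from the original text) of the difference between data-level and task-level parallelism, the sketch below applies one operation across many data elements (DLP-style) and then runs two different tasks concurrently (TLP-style), using Python threads:

```python
from concurrent.futures import ThreadPoolExecutor

data = [1, 2, 3, 4, 5, 6, 7, 8]

with ThreadPoolExecutor(max_workers=4) as pool:
    # Data-level parallelism: the SAME operation (squaring) over many elements.
    squares = list(pool.map(lambda x: x * x, data))

    # Task-level parallelism: DIFFERENT tasks submitted concurrently.
    f1 = pool.submit(sum, data)   # task 1: total
    f2 = pool.submit(max, data)   # task 2: maximum
    total, biggest = f1.result(), f2.result()

print(squares)         # [1, 4, 9, 16, 25, 36, 49, 64]
print(total, biggest)  # 36 8
```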

Trend Towards Utility Computing: In today's world, the global IT industry has experienced the huge emergence of a model of computing called "cloud computing", which has gained acceptance as a new market category. Practically, cloud computing has evolved through various phases, among which grid computing and utility computing hold much of the significance. Utility computing is a quantifiable (metered) service that acts as a means of delivering computational and storage resources. The functions and operations performed by this service are the same as those of traditional public utility companies, but utility computing is an improved version of the traditional service, as it is free from the impediments incurred by the latter. It was basically utilized by early enterprise adopters for non-mission-critical needs, and later on utility computing was extended to an advanced level called cloud computing. This form of computing made virtualisation technology and virtual servers easily accessible to IT departments. The buzz and excitement around cloud computing reflect the emergence of a new model of computing in the IT industry. It has helped in increasing storage flexibility, high-speed bandwidth and universal software interoperability, as well as in decreasing cost.

     

    Question
    Anandam Lokesh
    7 months ago
1 Answer posted
    Answer posted by KURVA SHEKER
    7 months ago

GPU stands for Graphics Processing Unit; it is used to manipulate 3D graphics, multimedia and images. The major aim of this concept is to free the processor from tasks associated with graphics by handling these tasks on the graphics card itself. This is done by implementing the GPU as a coprocessor on the video card.

The first GPU was developed by NVIDIA in 1999 and named the GeForce 256; it was capable of handling 10 million polygons per second. GPUs are now used in almost all computers. A GPU is designed in such a way that it processes multiple threads simultaneously, providing massive parallelism. Modern GPUs are capable of processing 1024 concurrent threads, and they focus on providing increased throughput at the chip level.

With improvements in GPU technology, GPUs are used for floating-point operations and data-intensive calculations apart from processing graphics. For this reason, they are now found in mobile phones, gaming consoles, personal computers and many other devices.

Fermi GPU: A Fermi-based GPU has the following advantages:

1. It has improved memory access and double-precision floating-point performance.

2. It supports ECC.

3. It provides a cache hierarchy.

4. It shares memory among streaming multiprocessors.

5. It performs faster context switching, atomic operations and instruction scheduling.

6. It uses a prediction method in order to reduce branch penalty.

A Fermi GPU consists of the following components:

1. 3.0 billion transistors.

2. 512 cores arranged in 16 streaming multiprocessors of 32 cores each, which in turn share an L1 cache. Each core executes one floating-point or integer instruction per clock.

3. A 384-bit (i.e., 6×64) DRAM interface provided by the GPU chip, supporting a total of 6 GB of memory.

4. A PCI Express host interface in order to connect the GPU to the CPU.

5. A GigaThread unit to schedule groups of threads among the streaming multiprocessors (SMs).

In addition to its 32 cores, each streaming multiprocessor (SM) also contains 16 load/store units and four independent Special Function Units (SFUs) in order to perform mathematical functions such as sine, cosine, reciprocal and square root. Each of the 32 cores is in turn provided with an Arithmetic Logic Unit (ALU) and a Floating-Point Unit (FPU).
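The per-clock figure for the 512 cores can be turned into a rough peak-rate estimate. This is a back-of-the-envelope sketch: the factor of 2 assumes one fused multiply-add (two FLOPs) per core per clock, and the 1.5 GHz clock is an assumed figure, not one given in the text:

```python
def peak_gflops(cores, clock_ghz, flops_per_core_per_clock=2):
    """Peak throughput = cores x clock x FLOPs issued per core per clock."""
    return cores * clock_ghz * flops_per_core_per_clock

# Full Fermi chip: 512 cores at an assumed 1.5 GHz shader clock.
print(peak_gflops(512, 1.5))  # 1536.0 (GFLOPS, single precision)
```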

    Question
    Ruchitha
    7 months ago
1 Answer posted
    Answer posted by Venu Gopal reddy Seelam
    7 months ago

HPC computing:

• HPC stands for high-performance computing, which improves performance by increasing computing speed.
• The main motivation behind such computing is progress in various fields such as engineering and science, which pushed speed measures to Gflops and even Pflops.
• Modern computing involves desktops or PCs connected over LANs and WANs, where computing must provide efficient ways to make servers transparent along with faster access.

HTC computing:

• HTC stands for high-throughput computing, which focuses on providing concurrent access to data present over the Internet to millions of devices.
• Throughput can be defined as the rate at which tasks are carried out.
• This rate can be increased by raising the number of tasks completed while decreasing the time taken. Apart from speeding up processing, HTC also addresses issues associated with cost, security and reliability.
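The throughput definition above can be written down directly; a minimal sketch with made-up numbers:

```python
def throughput(tasks_completed, elapsed_seconds):
    """HTC figure of merit: tasks completed per unit time."""
    return tasks_completed / elapsed_seconds

# Completing more tasks in the same time raises throughput ...
rate_a = throughput(1_000_000, 3600)
# ... and so does completing the same tasks in less time.
rate_b = throughput(1_000_000, 1800)
print(rate_a, rate_b)  # rate_b is exactly twice rate_a
```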
    Question
    Anvesh Kanchibhotla
    7 months ago
1 Answer posted
    Answer posted by Anandam Lokesh
    7 months ago

There are different types of computing paradigms:

1. Centralized or monolithic computing:

Computing done using a single computer that is not part of any network is referred to as monolithic computing. In this type of computing, the system makes use of only those resources to which it has access. Since only one user is using the system at a given instant of time, the computing is called single-user monolithic computing. For instance, if a user is using the MS Excel application, he/she is performing monolithic computing. However, it is also possible for multiple users to share the same system simultaneously using the concept of time sharing. This technique enables multiple users to share the resources of a single computer in parallel based on predefined criteria. Such a form of computing is referred to as multi-user monolithic computing. It can be carried out using a mainframe computer that provides centralized resources; multiple users establish connections with this computer using a device known as a "terminal", which enables interaction with the system during a terminal session. An application that falls under this category of computing is payroll generation.

2. Parallel computing:

Computing wherein a single program is executed by multiple processors simultaneously is referred to as parallel computing. The major advantage of parallel computing is that the execution of a program is done at very high speed. However, the difficulty of such computing lies in dividing a single program among multiple processors without their interfering with one another. This sort of computing is generally performed on a computer that contains more than one processor; however, it can also be performed by connecting computers in a network. Parallel computing is especially used in areas such as weather forecasting and semiconductor design. Besides this, it is also used to solve problems which cannot be solved by a single computer.
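The divide-without-interference idea can be sketched in miniature: one summation is split into independent chunks, each handled by a separate worker, and the partial results are combined at the end. Python threads stand in for real processors here; this is an illustrative sketch, not taken from the original text:

```python
from concurrent.futures import ThreadPoolExecutor

def parallel_sum(values, workers=4):
    """Split one summation across several workers, then combine the
    partial results -- the basic shape of parallel computing."""
    chunk = (len(values) + workers - 1) // workers
    parts = [values[i:i + chunk] for i in range(0, len(values), chunk)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = list(pool.map(sum, parts))  # each worker sums its own chunk
    return sum(partials)                       # chunks never interfere

print(parallel_sum(list(range(101))))  # 5050
```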

3. Distributed Computing:

In the distributed computing model, the processing is done on multiple computers that are connected in the same network. Each of these computers has its own processor in addition to other resources.

All the workstations, i.e., nodes, have complete access to the resources of the local computer to which they are connected. When there is an interaction between local computers and remote computers, the users are able to access the resources of remote computers as well. The best example of distributed computing is the World Wide Web. To visit a website, a user makes use of a browser, for example Internet Explorer, Mozilla Firefox or Netscape Navigator, which runs on the user's local system. The browser interacts with another program running on a remote system to search for a required file, which may reside on any other remote system.
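The local-program/remote-program interaction described above can be sketched with a toy client and server on one machine; the port is chosen by the OS, and the message and "echo" behaviour are illustrative assumptions:

```python
import socket
import threading

def serve_once(sock):
    """The 'remote' program: accept one connection and echo the request back."""
    conn, _ = sock.accept()
    with conn:
        data = conn.recv(1024)
        conn.sendall(b"echo: " + data)

# "Remote" side: bind to an OS-chosen free port on localhost.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]
threading.Thread(target=serve_once, args=(server,), daemon=True).start()

# "Local" side: the client program, e.g. a browser requesting a file.
client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"GET /file")
reply = client.recv(1024)
client.close()
server.close()
print(reply.decode())  # echo: GET /file
```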

4. Cloud Computing:

Cloud computing can be viewed as a model for delivering information technology in which resources are accessed from the Internet without depending on a direct connection with the server; resources are retrieved via web-based tools and applications. Here, the information to be accessed is stored in the cloud, and the user is given the privilege of accessing it whenever and from wherever they want, thereby allowing users to work remotely. In general, cloud computing is nothing but the use of computing resources, such as hardware and software, which are delivered as a service across the network. It centralises data storage, processing and bandwidth, which in turn provides efficient computing to users.
