

    Computer Scientist Answers Computer Questions From Twitter

    Professor and computer scientist David J. Malan joins WIRED to answer your computer and programming questions from Twitter. How do search engines compile information so quickly? Which operating system is best? How do microchips work?

    Released on 08/29/2023

    Transcript

    Hello world.

    My name is Professor David J. Malan,

    I teach computer science at Harvard,

    and I'm here today to answer your questions from Twitter.

    This is Computer Support.

    [upbeat music]

    First up from tadproletarian,

    How do search engines work so fast?

    Well, the short answer really is distributed computing,

    which is to say that Google and Bing,

    and other such search engines,

    they don't just have one server

    and they don't even have just one really big server,

    rather they have hundreds, thousands,

    probably hundreds of thousands or more servers nowadays

    around the world.

    And so when you and I go into Google or Bing

    and maybe type in a word to search for, like cats,

    it's quite possible that when you hit enter

    and that keyword like cats is sent over the internet

    to Google or to Bing, it's actually spread out ultimately

    across multiple servers,

    some of which are grabbing the first 10 results,

    some of which are grabbing the next 10 results,

    the next 10 results,

    so that you see just one collection of results,

    but a lot of those ideas,

    a lot of those search results came from different places.

    And this eliminates

    what could potentially be a bottleneck of sorts

    if all of the information you needed

    had to come from one specific server

    that might very well be busy when you have that question.
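
    To make that concrete, here is a minimal sketch in Python. It is purely illustrative: the shard names, scores, and merge step are hypothetical, not how Google or Bing actually implement search, but it shows the basic idea of fanning one query out to many servers and merging their partial results.

    ```python
    # Illustrative only: hypothetical shards and scores, not a real search engine.
    from concurrent.futures import ThreadPoolExecutor

    SHARDS = ["server-1", "server-2", "server-3"]  # each holds part of the index

    def search_shard(shard: str, query: str) -> list[tuple[float, str]]:
        # Pretend each shard returns its own top hits as (score, url) pairs.
        return [(0.9, f"https://example.com/{shard}/{query}/1"),
                (0.5, f"https://example.com/{shard}/{query}/2")]

    def search(query: str, k: int = 10) -> list[str]:
        # Send the same query to every shard at the same time...
        with ThreadPoolExecutor() as pool:
            partials = pool.map(lambda s: search_shard(s, query), SHARDS)
        # ...then merge the partial lists into one ranked list for the user.
        merged = sorted((hit for part in partials for hit in part), reverse=True)
        return [url for _, url in merged[:k]]

    print(search("cats"))
    ```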

    Nick asks, Will computer programming jobs be taken

    over by AI within the next 5 to 10 years?

    This is such a frequently asked question nowadays

    and I don't think the answer will be yes.

    And I think we've seen evidence of this already

    in that early on when people were creating websites,

    they were literally writing out code

    in a language called HTML by hand.

    But then of course, software came along,

    tools like Dreamweaver that you could download

    on your own computer

    that would generate some of that same code for you.

    More recently though, now you can just sign up for websites

    like Squarespace, and Wix, and others

    whereby click, click, click

    and the website is generated for you.

    So I dare say certainly in some domains,

    that AI is really just an evolution of that trend

    and it hasn't put humans out of business

    as much as it has made you and me much more productive.

    AI, I think, and the ability soon to be able

    to program with natural language

    is just going to enhance what you and I

    can already do logically, but much more mechanically.

    And I think too it's worth considering

    that there's just so many bugs

    or mistakes in software in the world

    and there's so many features

    that humans wish existed in products present and future

    that our to-do list, so to speak,

    is way longer than we'll ever have time

    to finish in our lifetimes.

    And so I think the prospect

    of having an artificial intelligence boost our productivity

    and work alongside us, so to speak,

    as we try to solve problems, is just gonna mean

    that you and I and the world together

    can solve so many more problems

    and move forward together at an even faster rate.

    All right, next up Sophia, who asks,

    How do microchips even work?

    It's just a green piece of metal.

    Well, here for instance, we have a whole bunch of microchips

    on what's called a logic board

    or sometimes known as a motherboard.

    There's a lot of ports

    that you might be familiar with, for instance.

    Like here's some ports for audio,

    here's some ports for networking,

    here's some ports for USB and some other devices as well.

    And those ports meanwhile are connected

    to lots of different chips on this board

    that know how to interpret the signals from those ports.

    And perhaps the biggest chip on this motherboard

    tends to be this thing here called the CPU,

    or the central processing unit,

    which is really the brains of the computer.

    And what you can't necessarily quite see,

    'cause most of this is actually paint and not traces,

    but if I flip this around, you'll actually see,

    in the right light and with the right angle,

    a whole bunch of traces running up,

    down, left, and right on this logic board

    that's connecting all of these various microchips.

    And by trace, I mean a tiny little wire

    that's been etched into the top

    or the bottom of this circuit board

    that connects two parts of the board.

    Now, what might these microchips be doing?

    Well, again, they might be simply interpreting signals

    that are coming in from these ports,

    or they might be performing mathematical operations,

    doing something with those signals

    in order to convert input into output,

    or they might just be storing information ultimately.

    In fact, there's all different types of memory

    on a logic board like this, be it RAM, or ROM, or the like,

    and so some of those chips

    might very well be storing information

    for as long as the computer's plugged in,

    or in some cases, depending on the device,

    even when the power goes off.

    All right, next a question from Nke_chi.

    So if anyone can learn coding,

    what do computer scientists do

    for four years in university?

    Typically, in an undergraduate program in computer science,

    or computer engineering, or a similar field,

    someone spends much more time learning

    about the field itself than about programming specifically.

    So as such, you might study not only a bit of programming,

    but also mathematics, certain fundamentals

    that transcend the particular classes you might've taken

    in middle school or high school,

    but that can be used to solve grander real world problems,

    you might learn something about networks,

    how you can send information from point A to point B,

    you might learn about graphics,

    how you can display things on the screen

    or even create interactive animations or the like,

    you might learn how to leverage certain ideas

    from mathematics and other fields

    to implement your very own artificial intelligence nowadays,

    whereby you use probability and statistics

    and information more generally to try to predict

    what an intelligent individual, or in this case a computer,

    might say in response to a question.

    So computer science itself is a very broad field

    and programming is really just a tool

    that you tend to learn along the way.

    From mayashelbyy,

    How do zeros and ones turn into the internet?

    Well, I think the simplest answer there

    is that the internet is built

    upon layers and layers and layers of ideas.

    And if we start at the lowest of those levels,

    zeros and ones, you have something called binary

    where zeros and ones can be used

    to represent any other numbers as well.

    And if we use more and more zeros and ones,

    more and more binary digits or bits so to speak,

    we can count up higher and higher and higher.

    And then if you and I agree that all right,

    well, let's not just use these patterns

    of zeros and ones to represent numbers,

    what if we reserve some of these patterns

    to represent letters of like the English alphabet,

    and then maybe you and I can decide

    to reserve certain patterns of zeros and ones

    to represent colors like red and green and blue

    and combinations thereof.

    Well, once we have the ability to represent colors,

    we could then represent whole pictures,

    because what's a picture on your phone or a computer screen?

    Well, it's really just a grid of dots,

    each of which has its own color.

    So this is all to say that even if we start

    at this lowest level of just zeros and ones,

    so long as you and I and all of the devices we use

    agree to follow some standard like this,

    we can build these layers and layers of abstraction,

    so to speak, on top of one another until finally,

    you and I come up with a pattern of zeros and ones

    that represents 'Send this piece of information

    from me over there.'

    And thus, we have something like the internet.
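
    As a small illustration in Python (using ASCII and 24-bit RGB as the agreed-upon standards), here is how the same kind of bit patterns can stand for a number, a letter, or a color, depending only on the convention we agree to follow.

    ```python
    # Bits are just bits; shared standards decide what they mean.
    n = 0b1000001                  # read as a number, this pattern is 65
    print(n)                       # 65
    print(chr(n))                  # 'A' -- ASCII reserves this same pattern for the letter A

    # A 24-bit color: 8 bits each for red, green, and blue.
    red, green, blue = 255, 0, 0
    color = (red << 16) | (green << 8) | blue
    print(format(color, "024b"))   # 111111110000000000000000, i.e. pure red
    ```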

    majinbuu asks, Can someone that knows computer science

    explain to me why computers use binary coding

    and not trinary when trinary is supposed to be faster?

    So it's not necessarily the case that a trinary system,

    which would use three symbols,

    for instance, zero, one, and two,

    would be faster than binary,

    because binary, using just zero and one,

    tends to be simpler to implement

    and also more robust to potential errors.

    Or if you're familiar with voltage levels,

    like in a battery, it's very easy for a computer

    to distinguish something like zero volts from three volts,

    but it gets a little harder

    if we try to draw the lines somewhere in between,

    because there's just a higher probability

    that a computer might mistake a voltage level,

    like 1.5 in the middle,

    as maybe being a little closer to off than on

    or to on than off.

    Here too is where

    even though there might be mathematical efficiencies

    or real-world efficiencies to using trinary,

    otherwise known as ternary, like a zero, a one,

    and a two digit instead of just zeros and ones,

    it turns out because our world runs on electricity nowadays

    and there's so much momentum behind binary

    that sticking with binary just tends to be a net positive.
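
    Here is a toy sketch of that robustness argument, with made-up voltage levels and noise rather than real hardware behavior: binary gets one wide decision threshold, ternary needs two thresholds packed closer together, so the same amount of electrical noise causes more misreads.

    ```python
    # Toy model only: made-up voltage levels and noise, not real circuit behavior.
    import random

    random.seed(0)

    def read_binary(volts):
        return 0 if volts < 1.5 else 1            # one threshold, lots of margin

    def read_ternary(volts):
        if volts < 0.75:                          # two thresholds, less margin each
            return 0
        return 1 if volts < 2.25 else 2

    def error_rate(levels, reader, trials=100_000, noise=1.0):
        errors = 0
        for _ in range(trials):
            symbol = random.randrange(len(levels))
            measured = levels[symbol] + random.uniform(-noise, noise)
            errors += (reader(measured) != symbol)
        return errors / trials

    print("binary error rate: ", error_rate([0.0, 3.0], read_binary))        # ~0.0
    print("ternary error rate:", error_rate([0.0, 1.5, 3.0], read_ternary))  # noticeably higher
    ```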

    rachaelp95 asks, Why is every Windows solution,

    'Have you tried restarting?'

    And why does that always work?

    So that's a very heavy-handed solution

    to what are typically just bugs or mistakes in software,

    for instance, Windows in this case.

    Restarting a computer just starts everything from scratch.

    So all of the computer's short-term memory is lost

    and everything starts in pristine condition,

    which is to say that it starts

    in exactly the way that the programmers

    at Microsoft intended without potentially the distractions

    of the computer being in some weird state

    or condition that the programmers just didn't anticipate.

    Maybe you clicked on some buttons in a weird order,

    maybe you opened a strange file,

    but maybe you got the computer into a state

    that just wasn't programmed for properly.
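
    As a tiny, made-up illustration (not Windows code): a program's in-memory state can drift into a combination its author never anticipated, and restarting simply rebuilds that state from the known-good defaults.

    ```python
    # Hypothetical example of "weird state" that a restart wipes away.
    def fresh_state():
        return {"window_open": False, "file_loaded": False}

    state = fresh_state()

    def print_document():
        # The author assumed a file is only ever loaded after a window is open...
        assert state["window_open"], "unexpected state: no window is open!"
        print("printing:", state["file_loaded"])

    state["file_loaded"] = True   # ...but clicks in a weird order break that assumption,
                                  # and print_document() would now fail.

    state = fresh_state()         # "restarting": everything back to pristine condition
    ```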

    Jason Witmer now asks, What's the best operating system?

    Well, this is one of these questions

    in computing we would call a religious question,

    since it evokes a religious debate

    as to which might be best.

    Of course, among the most popular operating systems

    out there are Windows and macOS,

    but there's also one you might not have heard of,

    which is called Linux, which is actually omnipresent

    in the enterprise world.

    So many of today's servers actually run Linux

    while so many of today's desktops

    or laptops run Windows or macOS.

    Now, that's not to say you couldn't run

    all of those operating systems in different contexts,

    and some of us do actually run Linux on our own computers,

    so a lot of it really boils down to personal preference.

    I wouldn't even say that there's one best operating system,

    but there tend to be correlations

    between the operating systems people use

    and the applications they have in mind.

    So Windows, for instance, is so popular

    in the world of PCs and desktops and laptops.

    macOS is too, to some extent,

    particularly in academia and certain countries,

    but not necessarily on the same scale.

    Linux, by contrast, is again used very heavily

    in the server-side industry, but so is Windows as well.

    So a lot of the choice for operating systems

    sometimes comes from just what's most appropriate,

    what's most popular, what's best supported,

    but some of it comes too from just personal preference

    of the engineer, maybe the first engineer that you hire

    to make one of those decisions.

    So it's more about what's best for you

    and not so much best in general.

    Next, Giulio Magnifico asks,

    Why aren't computers getting cheaper?

    Well, computers, or at least computer parts

    inside of computers, do tend to get cheaper.

    The catch is that your expectations

    and my expectations just keep rising.

    We want our phones, our laptops,

    our desktops to do more and more

    in the way of the software that they run,

    the games that we use,

    and just how quickly they perform for us.

    So even though some of those parts

    are getting less expensive,

    you and I want them to do more and more

    and be faster and larger in quantity,

    and so as a result, I dare say,

    that the price isn't going down as far as you might hope.

    That said, nowadays you can get,

    for the same amount of money from yesteryear,

    much, much more in the way of computing power.

    So arguably, it's working to our benefit in some cases.

    Next up from DairoNabilah,

    Can someone explain cloud computing

    to me like a five-year-old?

    Cloud computing is essentially

    you using someone else's servers

    that you're paying to rent, for instance, or timeshare.

    So this isn't really a new idea or a new technology,

    rather it's a better branding

    of a technique that's been used for years,

    not just in the computer world,

    but in the real world as well,

    whereby someone like Google or Microsoft or Apple

    or others nowadays might be able to afford lots and lots

    and lots of servers and then make those servers available

    in part to me, to you, and many other customers as well.

    Hey, I'm Marcus.

    Hey, Marcus.

    Well, Marcus asks, How does computer memory work?

    Think of computer memory as really being driven

    by a whole bunch of switches

    that can be turned either on or off.

    So for instance, if I take this here light switch,

    which is currently off, I could simply say

    that this switch here

    is representing the number zero in binary.

    But if I turn the switch on,

    well now I can say that I'm representing the number one.

    Now, of course, I can only count from zero to one

    with a single light switch,

    but what if I bring over a second light switch,

    like this one here?

    If we start at zero in this way,

    turn on this switch first and claim that it's one,

    let me now be more creative

    and turn this one off and this one on,

    and now claim this is how a computer's memory

    could represent the number two.

    And now if I turn this switch back on,

    giving me a fourth pattern,

    this is how I might represent the number three.

    Now, of course, if we add more and more of these switches,

    more and more of these light bulbs,

    we can count even higher than three.

    And indeed that's what a computer's memory

    is ultimately doing.

    It's using lots and lots of little tiny switches,

    otherwise known as transistors,

    to turn the flow of electricity on and off,

    and then it's got other types of hardware

    called, for instance, capacitors

    that have a capacity to hold onto some of that electricity

    just like the light bulb there being on.
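
    Here is the same idea as a short sketch in Python: two on/off switches, modeled as bits, give four patterns, enough to count from zero to three, and every extra switch doubles the number of patterns.

    ```python
    # Two switches (bits) -> four patterns -> the numbers 0 through 3.
    from itertools import product

    for left, right in product([0, 1], repeat=2):
        value = left * 2 + right * 1           # each switch position carries a weight
        print(f"switches {left}{right} represent {value}")

    # Eight switches (one byte) already give 2 ** 8 = 256 patterns: 0 through 255.
    print(2 ** 8)
    ```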

    All right, next, Donny asks,

    How do you explain Web3 to people?

    So Web3, like Web 2 and, retrospectively, Web 1,

    are really just buzzwords that describe sort of phases

    of the internet or the worldwide web as you and I know it.

    For instance, back in the day,

    when there was just the worldwide web,

    now perhaps referred to as Web version one,

    information was largely static.

    If you were to create a website on the internet,

    you would type up your code, you would type up your content,

    you would put it on a server somewhere,

    and someone could read that information,

    but it was you, the web developer,

    or you, the owner of the website,

    that was creating that content

    for other people to actually read and consume.

    In Web 2, the world became much more dynamic in recent years

    whereby now websites tend to have databases

    and they have more sophistication,

    so that a lot of the content in websites today

    is actually coming from me and from you.

    So if you think of any social media site,

    it's not the owners of those sites

    that are creating most of the content,

    it's you and me as the users of those same websites.

    But in Web 2, everything is nonetheless very centralized,

    whether you're Twitter or Facebook, now Meta,

    or other companies, all of that data,

    even in the world of social media,

    that's coming from me and you

    is actually being stored centrally on those company servers.

    So Web 3.0 or Web3, so to speak,

    is really about transitioning away potentially

    from that very centralized model

    to one that's more distributed, where the data

    that you and I are creating,

    and the data you and I are consuming,

    is actually distributed over multiple servers

    using a technique called blockchain,

    for instance in some cases,

    whereby there's not necessarily one owner of that data,

    but really collective ownership and therefore verification

    that the data did indeed come from me and you.
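
    Here is a very stripped-down sketch of that verification idea (illustrative only, a toy hash chain rather than any real network's protocol): each block records a fingerprint of the previous block, so anyone holding a copy can check that earlier data wasn't quietly changed.

    ```python
    # Toy hash chain for illustration; not a real blockchain protocol.
    import hashlib

    def make_block(data: str, prev_hash: str) -> dict:
        digest = hashlib.sha256((data + prev_hash).encode()).hexdigest()
        return {"data": data, "prev_hash": prev_hash, "hash": digest}

    def verify(chain: list[dict]) -> bool:
        # Anyone with a copy can recompute every link and spot tampering.
        return all(block["prev_hash"] == prev["hash"]
                   for prev, block in zip(chain, chain[1:]))

    chain = [make_block("post from you", "")]
    chain.append(make_block("post from me", chain[-1]["hash"]))
    print(verify(chain))               # True

    chain[0]["data"] = "edited post"   # quietly rewrite history...
    chain[0]["hash"] = hashlib.sha256(("edited post" + "").encode()).hexdigest()
    print(verify(chain))               # ...and the chain no longer verifies: False
    ```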

    Next, a question from gomotigers,

    Can someone explain to me the difference

    between firmware and software?

    Hardware is physical, software is code, wtf is firmware?

    Firmware is really a synonym for a type of software.

    So firmware is just software,

    but it tends to be software

    that comes built into your hardware.

    And in the simplest scenario, you can think

    of firmware as software

    that is just completely integrated into the hardware

    and itself cannot be changed or even upgraded.

    But that's a bit of an oversimplification,

    because even firmware typically,

    when it comes in a computer,

    when it comes in a phone, or some other device,

    can very often be updated.

    Why?

    Because the firmware is the software

    that's really closest to the hardware,

    and in that sense, it might very well be the most important.

    And if anything goes wrong with the firmware,

    you might not even be able to turn that device on,

    whether it's a phone, a computer,

    or even your refrigerator nowadays.

    All right, that's all the questions for today.

    We hope you learned a little something along the way.

    We'll see you next time.
