
Why was hardware diversification an asset for the IBM PC ecosystem?


The IBM PC was cloned very early on, and many third parties made hardware peripherals. This required users to run an OS, install drivers, manage IRQs and hardware bus addresses, etc. Why was this a strength of the platform, instead of a weakness?



I'm thinking specifically of the Commodore Amiga, where the hardware was less diverse and end users and programs didn't need to manage so much fragmentation. I'm also thinking of modern examples: Android, where fragmentation is a big problem, and iOS, where the hardware is less diverse and that is seen as an asset of the platform in some respects.



Inspired by "Why weren't bootable game disks ever common on the IBM PC?"










  • Not just a hammer but the tool you need. Then you don't need other tools.

    – Thorbjørn Ravn Andersen
    9 hours ago

















hardware amiga ibm-pc






asked 9 hours ago









nexus_2006




6 Answers


















Unlike the Amiga, the IBM PC was always seen as mainly a business computer. Having business done on lots of small machines on desks, rather than on large central machines via terminals, meant that businesses had to provide support staff to configure machines and assist users, rather than having staff to look after the central machines.



However, the total cost of ownership would seem to have been lower, judging by the runaway success of PC-based business computing. PCs also became less demanding to configure over time, as they came with more and more built-in capability.



Hardware diversification also provided ways for PC manufacturers to compete, driving costs down and ease-of-use up. However, they always remained PC-compatible, which meant they could run the same application software.



The fragmentation of Android, in my experience, has been caused by the fragmentation of the ARM 32-bit instruction set, which was a consequence of ARM's willingness to customise their processor designs for their hardware-manufacturer customers, and those hardware manufacturers' lack of appreciation for the benefits of application compatibility. It's just the same situation as you'd get if Dell and HP Windows machines had different instruction sets, custom-designed for Dell and HP by Intel. As a chip manufacturer, wanting the benefits of making millions of identical chips, Intel don't do that. ARM discarded the idea with ARMv8, which does not get customised by individual customers, curing the fragmentation problem.






– John Dallman
  • The existence of a range of "vendor option" opcodes that behave differently on different platforms would only cause compatibility problems if machine-language programmers or compilers explicitly used them without first checking via some means whether the target platform supported the desired instructions.

    – supercat
    7 hours ago


















The IBM PC was cloned very early on, and many third parties made
hardware peripherals. This required users to run an OS, install
drivers, manage IRQs and hardware bus addresses, etc. Why was this a
strength of the platform, instead of a weakness?




A wide range of hardware devices was a strength, in that, if IBM wasn't willing to build it, or was unable to build it and release it in a timely manner, then someone else (perhaps many someone elses) could. This broadens the range of use for the platform as a whole. That makes for a virtuous cycle: greater range, more users, more incentive to support the system.



Dealing with the hardware configuration was a downside of the variety, sure, but apparently not a sufficient disincentive.



(The PDP-11 followed a similar arc, especially in its early days, where the documented bus architecture allowed a robust add-on marketplace to develop).






– another-dave

One problem in designing future-proof systems is deciding which features of the system's present design should be regarded as fundamental and which should be considered happenstance. If some features of a design get treated as fundamental, it will be very difficult to change them later; but if the features needed to accomplish tasks efficiently can't be relied upon, it will be impossible to accomplish those tasks reliably and efficiently.

Some aspects of the PC's design (e.g. the presence of CGA display memory at physical address 0xB8000) ended up being treated as fundamental because IBM and Microsoft failed to provide a reasonable alternative, even though it would have been better to avoid such dependence (e.g. by providing a function to inquire where display memory is located). For the most part, though, the PC ended up with a relatively practical mix of fundamental and happenstance design elements.






It was an advantage because the IBM PC became an extensible computing platform. The most popular competitor to it previously was the Apple II, another open platform.



The PC, as a platform, was popular with users because of the choices it enabled, with developers because of the foundations it laid, and with engineering firms because they could focus on what they wanted to build (say, a graphics card, or a computer vision system) rather than an entire computing system.



Since MS-DOS made it easy to install things like drivers, and the platform allowed things like configurable interrupts, everyone involved had a lot of flexibility in designing their solutions.
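For illustration, installing a driver under MS-DOS amounted to adding a DEVICE= line to CONFIG.SYS. A typical file might have looked like the fragment below; HIMEM.SYS and the FILES/BUFFERS directives are real, while the specific paths and the mouse driver name are illustrative.

```
DEVICE=C:\DOS\HIMEM.SYS
DEVICE=C:\DRIVERS\MOUSE.SYS
FILES=30
BUFFERS=20
```

No rebuild of the operating system was needed; the new hardware's vendor shipped the driver file and the user added one line.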



So, not only was the hardware "open" (the PC bus system was reasonably well documented), the software was also "open" (in that it was straightforward to augment the system with your own drivers). I don't mean "open" as it's typically used today (i.e. free, source code available, etc.), but "open" in the sense that anyone could make a card with no payment to IBM, and anyone could make a driver with no payment to Microsoft. As an end user, you also didn't have to rebuild the operating system yourself, as you did with CP/M; CP/M's design wasn't as flexible in this regard as MS-DOS's. (This changed a little later, but the legacy lived on.)



Yes, this led to complexity at the user level, since the burden of configuring the cornucopia of diverse systems fell on them. It also led to some complexity at the software development level, but only at the edge cases. In time, things became common and compatible, lowering the barrier for software.



So the PC offered a unique combination for its time: a robust foundation (especially once the XT arrived) in the BIOS, a large memory space, and a large array of peripherals (robust character graphics, nice high-resolution graphics, color graphics, floppies, hard drives, printers, serial ports), plus the "infinite" expandability of an open bus and an extensible base operating system. Add the explosion of the clones, and the network effect surrounding the PC became very compelling despite all of its innate weaknesses.



The PC represented a maturing of the product and the industry, after learning from the rise and struggles of the diverse early micro environment, where everything was cordoned off to a limited platform.






The hardware in PC clones was not particularly diverse, if the technical characteristics of the hardware are the distinguishing factor. Yes, there was a great diversity of manufacturers and vendors, but the diversity of features was pretty minimal. It is probably best to describe the PC clones of the 1980s as vanilla computers. And this was their real strength.

Because the hardware was "PC compatible", having the basic features required to make it so, it could be counted on to run the popular MS-DOS applications of the time. This actually liberated the user from having to think very much about the diversity of vendors. Most buyers of PC clones at the time weren't venturing into esoteric hardware for highly specific use cases, nor were they buying components and assembling a properly configured PC themselves. They just wanted a machine to run WordPerfect, Lotus 1-2-3, dBase III, and other popular business titles. Their buying decision was therefore generally driven by the price-to-performance ratio, and performance was mainly a matter of CPU (e.g. 8088 vs. 80286 vs. 80386) and clock speed. This allowed for fairly simple buyer comparisons.

So, the diversity of vendors created little real hassle for buyers (they always got the same basic features and compatibility), while also fostering a highly competitive market where the price/performance ratio fell rapidly. It was the rapidity of falling prices with rising performance that allowed the PC clones to overwhelm the competition from Apple, Commodore, etc.






  • I would add that the diversity came later. Initially, clones were practically exact duplicates of IBM's design, and there weren't a lot of expansion cards, so there wasn't much hardware to write software for. But the success of the platform and its expansion slots were eventually a draw for developers of third-party add-ons.

    – DrSheldon
    2 hours ago


















First, having the plans available allowed small companies to build their own IBM PC clones. You didn't have to be a monolithic company like IBM to build one. A modern analogy is that you don't have to be Google to design an Android phone.

Second, each of these numerous little companies did their own marketing, greatly expanding the platform's visibility in the marketplace.

Third, these companies often shared technology (chipsets, CPUs, BIOS firmware, and so on), reducing development and manufacturing costs through economies of scale.






        share|improve this answer

























          Your Answer








          StackExchange.ready(function()
          var channelOptions =
          tags: "".split(" "),
          id: "648"
          ;
          initTagRenderer("".split(" "), "".split(" "), channelOptions);

          StackExchange.using("externalEditor", function()
          // Have to fire editor after snippets, if snippets enabled
          if (StackExchange.settings.snippets.snippetsEnabled)
          StackExchange.using("snippets", function()
          createEditor();
          );

          else
          createEditor();

          );

          function createEditor()
          StackExchange.prepareEditor(
          heartbeatType: 'answer',
          autoActivateHeartbeat: false,
          convertImagesToLinks: false,
          noModals: true,
          showLowRepImageUploadWarning: true,
          reputationToPostImages: null,
          bindNavPrevention: true,
          postfix: "",
          imageUploader:
          brandingHtml: "Powered by u003ca class="icon-imgur-white" href="https://imgur.com/"u003eu003c/au003e",
          contentPolicyHtml: "User contributions licensed under u003ca href="https://creativecommons.org/licenses/by-sa/3.0/"u003ecc by-sa 3.0 with attribution requiredu003c/au003e u003ca href="https://stackoverflow.com/legal/content-policy"u003e(content policy)u003c/au003e",
          allowUrls: true
          ,
          noCode: true, onDemand: true,
          discardSelector: ".discard-answer"
          ,immediatelyShowMarkdownHelp:true
          );



          );













          draft saved

          draft discarded


















          StackExchange.ready(
          function ()
          StackExchange.openid.initPostLogin('.new-post-login', 'https%3a%2f%2fretrocomputing.stackexchange.com%2fquestions%2f11606%2fwhy-was-hardware-diversification-an-asset-for-the-ibm-pc-ecosystem%23new-answer', 'question_page');

          );

          Post as a guest















          Required, but never shown

























          6 Answers
          6






          active

          oldest

          votes








          6 Answers
          6






          active

          oldest

          votes









          active

          oldest

          votes






          active

          oldest

          votes









          2














          Unlike the Amiga, the IBM PC was always seen as mainly a business computer. Having business done on lots of small machines on desks, rather than on large central machines via terminals, meant that businesses had to provide support staff to configure machines and assist users, rather than having staff to look after the central machines.



          However, the total cost of ownership would seem to have been lower, judging by the runaway success of PC-based business computing. PCs also became less demanding to configure over time, as they came with more and more built-in capability.



          Hardware diversification also provided ways for PC manufacturers to compete, driving costs down and ease-of-use up. However, they always remained PC-compatible, which meant they could run the same application software.



          The fragmentation of Android, in my experience, has been caused by the fragmentation of the ARM 32-bit instruction set, which was a consequence of ARM's willingness to customise their processor designs for their hardware-manufacturer customers, and those hardware manufacturers' lack of appreciation for the benefits of application compatibility. It's just the same situation as you'd get if Dell and HP Windows machines had different instruction sets, custom-designed for Dell and HP by Intel. As a chip manufacturer, wanting the benefits of making millions of identical chips, Intel don't do that. ARM discarded the idea with ARMv8, which does not get customised by individual customers, curing the fragmentation problem.






          share|improve this answer

























          • The existence of a range of "vendor option" opcodes that behave differently on different platforms would only cause compatibility problems if machine-language programmers or compilers explicitly used them without first checking via some means whether the target platform supported the desired instructions.

            – supercat
            7 hours ago















          2














          Unlike the Amiga, the IBM PC was always seen as mainly a business computer. Having business done on lots of small machines on desks, rather than on large central machines via terminals, meant that businesses had to provide support staff to configure machines and assist users, rather than having staff to look after the central machines.



          However, the total cost of ownership would seem to have been lower, judging by the runaway success of PC-based business computing. PCs also became less demanding to configure over time, as they came with more and more built-in capability.



          Hardware diversification also provided ways for PC manufacturers to compete, driving costs down and ease-of-use up. However, they always remained PC-compatible, which meant they could run the same application software.



          The fragmentation of Android, in my experience, has been caused by the fragmentation of the ARM 32-bit instruction set, which was a consequence of ARM's willingness to customise their processor designs for their hardware-manufacturer customers, and those hardware manufacturers' lack of appreciation for the benefits of application compatibility. It's just the same situation as you'd get if Dell and HP Windows machines had different instruction sets, custom-designed for Dell and HP by Intel. As a chip manufacturer, wanting the benefits of making millions of identical chips, Intel don't do that. ARM discarded the idea with ARMv8, which does not get customised by individual customers, curing the fragmentation problem.






          share|improve this answer

























          • The existence of a range of "vendor option" opcodes that behave differently on different platforms would only cause compatibility problems if machine-language programmers or compilers explicitly used them without first checking via some means whether the target platform supported the desired instructions.

            – supercat
            7 hours ago













          2












          2








          2







          Unlike the Amiga, the IBM PC was always seen as mainly a business computer. Having business done on lots of small machines on desks, rather than on large central machines via terminals, meant that businesses had to provide support staff to configure machines and assist users, rather than having staff to look after the central machines.



          However, the total cost of ownership would seem to have been lower, judging by the runaway success of PC-based business computing. PCs also became less demanding to configure over time, as they came with more and more built-in capability.



          Hardware diversification also provided ways for PC manufacturers to compete, driving costs down and ease-of-use up. However, they always remained PC-compatible, which meant they could run the same application software.



          The fragmentation of Android, in my experience, has been caused by the fragmentation of the ARM 32-bit instruction set, which was a consequence of ARM's willingness to customise their processor designs for their hardware-manufacturer customers, and those hardware manufacturers' lack of appreciation for the benefits of application compatibility. It's just the same situation as you'd get if Dell and HP Windows machines had different instruction sets, custom-designed for Dell and HP by Intel. As a chip manufacturer, wanting the benefits of making millions of identical chips, Intel don't do that. ARM discarded the idea with ARMv8, which does not get customised by individual customers, curing the fragmentation problem.






          share|improve this answer















          Unlike the Amiga, the IBM PC was always seen as mainly a business computer. Having business done on lots of small machines on desks, rather than on large central machines via terminals, meant that businesses had to provide support staff to configure machines and assist users, rather than having staff to look after the central machines.



          However, the total cost of ownership would seem to have been lower, judging by the runaway success of PC-based business computing. PCs also became less demanding to configure over time, as they came with more and more built-in capability.



          Hardware diversification also provided ways for PC manufacturers to compete, driving costs down and ease-of-use up. However, they always remained PC-compatible, which meant they could run the same application software.



          The fragmentation of Android, in my experience, has been caused by the fragmentation of the ARM 32-bit instruction set, which was a consequence of ARM's willingness to customise their processor designs for their hardware-manufacturer customers, and those hardware manufacturers' lack of appreciation for the benefits of application compatibility. It's just the same situation as you'd get if Dell and HP Windows machines had different instruction sets, custom-designed for Dell and HP by Intel. As a chip manufacturer, wanting the benefits of making millions of identical chips, Intel don't do that. ARM discarded the idea with ARMv8, which does not get customised by individual customers, curing the fragmentation problem.







          share|improve this answer














          share|improve this answer



          share|improve this answer








          edited 9 hours ago

























          answered 9 hours ago









          John DallmanJohn Dallman

          4,4441 gold badge12 silver badges19 bronze badges




          4,4441 gold badge12 silver badges19 bronze badges












          • The existence of a range of "vendor option" opcodes that behave differently on different platforms would only cause compatibility problems if machine-language programmers or compilers explicitly used them without first checking via some means whether the target platform supported the desired instructions.

            – supercat
            7 hours ago

















          • The existence of a range of "vendor option" opcodes that behave differently on different platforms would only cause compatibility problems if machine-language programmers or compilers explicitly used them without first checking via some means whether the target platform supported the desired instructions.

            – supercat
            7 hours ago
















          The existence of a range of "vendor option" opcodes that behave differently on different platforms would only cause compatibility problems if machine-language programmers or compilers explicitly used them without first checking via some means whether the target platform supported the desired instructions.

          – supercat
          7 hours ago





          The existence of a range of "vendor option" opcodes that behave differently on different platforms would only cause compatibility problems if machine-language programmers or compilers explicitly used them without first checking via some means whether the target platform supported the desired instructions.

          – supercat
          7 hours ago













          1















          The IBM PC was cloned very early on, and many third parties made
          hardware peripherals. This required users to run an OS, install
          drivers, manage IRQs and hardware bus addresses, etc. Why was this a
          strength of the platform, instead of a weakness?




          A wide range of hardware devices was a strength, in that, if IBM wasn't willing to build it, or was unable to build it and release it in a timely manner, then someone else (perhaps many someone elses) could. This broadens the range of use for the platform as a whole. That makes for a virtuous cycle: greater range, more users, more incentive to support the system.



          Dealing with the hardware configuration was a downside of the variety, sure, but apparently not a sufficient disincentive.



          (The PDP-11 followed a similar arc, especially in its early days, where the documented bus architecture allowed a robust add-on marketplace to develop).






          share|improve this answer



























            1















            The IBM PC was cloned very early on, and many third parties made
            hardware peripherals. This required users to run an OS, install
            drivers, manage IRQs and hardware bus addresses, etc. Why was this a
            strength of the platform, instead of a weakness?




            A wide range of hardware devices was a strength, in that, if IBM wasn't willing to build it, or was unable to build it and release it in a timely manner, then someone else (perhaps many someone elses) could. This broadens the range of use for the platform as a whole. That makes for a virtuous cycle: greater range, more users, more incentive to support the system.



            Dealing with the hardware configuration was a downside of the variety, sure, but apparently not a sufficient disincentive.



            (The PDP-11 followed a similar arc, especially in its early days, where the documented bus architecture allowed a robust add-on marketplace to develop).






            share|improve this answer

























              1












              1








              1








              The IBM PC was cloned very early on, and many third parties made
              hardware peripherals. This required users to run an OS, install
              drivers, manage IRQs and hardware bus addresses, etc. Why was this a
              strength of the platform, instead of a weakness?




              A wide range of hardware devices was a strength, in that, if IBM wasn't willing to build it, or was unable to build it and release it in a timely manner, then someone else (perhaps many someone elses) could. This broadens the range of use for the platform as a whole. That makes for a virtuous cycle: greater range, more users, more incentive to support the system.



              Dealing with the hardware configuration was a downside of the variety, sure, but apparently not a sufficient disincentive.



              (The PDP-11 followed a similar arc, especially in its early days, where the documented bus architecture allowed a robust add-on marketplace to develop).








              – another-dave, answered 8 hours ago



































                  One problem in designing future-proof systems is deciding which features of the system's present design should be regarded as fundamental and which should be considered happenstance. If too many features get treated as fundamental, it becomes very difficult to change them later; but if the features needed to accomplish tasks efficiently can't be relied upon, it becomes impossible to accomplish those tasks reliably and efficiently.



                  Some aspects of the PC's design (e.g. the presence of CGA display memory at physical address 0xB8000) ended up being treated as fundamental because IBM and Microsoft failed to provide a reasonable alternative, even though it would have been better to avoid such dependence (e.g. by providing a function to inquire where display memory is located). For the most part, though, the PC ended up with a relatively practical mix of fundamental and happenstance design elements.














                      – supercat, answered 9 hours ago, edited 8 hours ago



































                          It was an advantage because the IBM PC became an extensible computing platform. Before it, the most popular comparable machine was the Apple II, another open platform.



                          The PC, as a platform, was popular with users because of the choices it enabled, with developers because of the foundations it laid, and with engineering firms because they could focus on what they wanted to build (say, a graphics card, or a computer vision system) rather than designing an entire computing system.



                          Since MS-DOS made it easy to install things like drivers, and the platform allowed things like configurable interrupts, everyone involved had a lot of flexibility in designing their solutions.
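                          To make that concrete: on MS-DOS a third-party driver was typically installed just by adding a DEVICE= line to CONFIG.SYS, with no rebuild of the operating system. The fragment below is purely illustrative (the driver names, paths, and option switches are made up), but it shows the shape of the mechanism:

```text
REM CONFIG.SYS -- each DEVICE= line loads a driver at boot
DEVICE=C:\DRIVERS\MOUSE.SYS
DEVICE=C:\DRIVERS\NETCARD.SYS IRQ=5 PORT=300
FILES=30
BUFFERS=20
```

                          Contrast this with CP/M, where supporting new hardware generally meant modifying and reassembling the BIOS portion of the system itself.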



                          So, not only was the hardware "open" (the PC bus system was reasonably well documented), the software was also "open" (in that it was straightforward to augment the system with your own drivers). I don't mean "open" as it's typically used today (i.e. free, source code available, etc.), but "open" in the sense that anyone could make a card with no payment to IBM, and anyone could make a driver with no payment to Microsoft. As an end user, you also didn't have to rebuild the operating system yourself the way you did with CP/M; CP/M's design wasn't as flexible in this regard as MS-DOS's. (This changed a little later, but the legacy lived on.)



                          Yes, this led to complexity at the user level, since the burden of configuring the cornucopia of diverse systems fell on them. It also led to some complexity at the software development level, but only at the edge cases. In time things became common and compatible, lowering the barrier for software.



                          So the PC offered a combination unique at the time: a robust foundation (especially once the XT hit) of BIOS, large memory space, and a wide array of peripherals (robust character graphics, nice hi-res graphics, color graphics, floppies, hard drives, printers, serial ports); the "infinite" expandability of an open bus and an extensible base operating system; and then the explosion of the clones. The network effect surrounding the PC became very compelling despite all of its innate weaknesses.



                          The PC represented a maturity of the product and the industry after learning from the rise and struggles of the diverse early micro environment where everything was cordoned off to a limited platform.
















                              – Will Hartung, answered 7 hours ago



































                                  The hardware in PC clones was not particularly diverse, if the technical characteristics of the hardware are the distinguishing factor. Yes, there was a great diversity of manufacturers and vendors, but the diversity of features was pretty minimal. It is probably best to describe the PC clones of the 1980s as vanilla computers. And this was their real strength.



                                  Because the hardware was "PC compatible", having the basic features required to make it so, it could be counted on to run the popular MS-DOS applications of the time. This actually liberated the user from having to think very much about the diversity of vendors. Most buyers of PC clones at the time weren't venturing into esoteric hardware for highly specific use-cases, nor were they buying components and assembling a properly configured PC themselves. They just wanted a machine to run Wordperfect, Lotus 1-2-3, dBase III, and other popular business titles. Therefore, their buying decision was generally driven by the price-to-performance ratio, and the performance was mainly a matter of CPU (e.g. 8088 vs. 80286 vs. 80386) and clock speed. This allowed for fairly simple buyer comparisons.



                                  So, the diversity of vendors created little real hassle for buyers (they always got the same basic features and compatibility), while also fostering a highly-competitive market where price/performance ratio fell rapidly. It was the rapidity of falling prices with rising performance that allowed the PC clones to overwhelm the competition from Apple, Commodore, etc.































                                  • I would add that the diversity came later. Initially, clones were practically exact duplicates of IBM's design, and there weren't many expansion cards, so there wasn't much extra hardware to write software for. But the success of the platform and its expansion slots eventually drew developers of third-party add-ons.

                                    – DrSheldon
                                    2 hours ago























                                  – Brian H, answered 6 hours ago, edited 6 hours ago





































                                  First, having the plans available allowed small companies to build their own IBM PC clones. You didn't have to be a monolithic company like IBM to build one. A modern analogy is that you don't have to be Google to design an Android phone.



                                  Second, each of these numerous little companies did their own marketing, greatly expanding the platform's visibility in the marketplace.



                                  Third, these companies often shared technology (chipsets, CPUs, BIOS firmware, and so on), reducing development and manufacturing costs through economies of scale.
















                                      – snips-n-snails, answered 7 hours ago


























