Which approach can I use to generate text based on multiple inputs?




I have a little experience in building various models, but I've never created anything like this, so I'm just wondering if I can be pointed in the right direction.

I want to create (in Python) a model that will generate text based on multiple inputs, ranging from text input (vectorized) to timestamp and integer inputs.

For example, in the training data, the input might include:

eventType = ShotMade
shotType = 2
homeTeamScore = 2
awayTeamScore = 8
player = JR Smith
assist = George Hill
period = 1

and the output might be (possibly minus the hashtags):

JR Smith under the basket for 2! 8-4 CLE. #NBAonBTV #ThisIsWhyWePlay #PlayByPlayEveryDay #NBAFinals

or

JR Smith out here doing #WhateverItTakes to make Cavs fans forgive him. #NBAFinals

Where is the best place to look to get a good knowledge of how to do this?
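For concreteness, here is a minimal sketch of how one such training pair might be represented in Python; the dict layout and the flattened conditioning string are purely illustrative assumptions, not part of any particular library or of the question itself.

```python
# Hypothetical representation of one training example: structured play-by-play
# features paired with the tweet-style caption the model should learn to generate.
training_example = {
    "inputs": {
        "eventType": "ShotMade",
        "shotType": 2,
        "homeTeamScore": 2,
        "awayTeamScore": 8,
        "player": "JR Smith",
        "assist": "George Hill",
        "period": 1,
    },
    "target_text": (
        "JR Smith under the basket for 2! 8-4 CLE. "
        "#NBAonBTV #ThisIsWhyWePlay #PlayByPlayEveryDay #NBAFinals"
    ),
}

# One common trick is to flatten the structured fields into a single
# conditioning string that a text model can consume directly.
condition_string = " | ".join(
    f"{key}={value}" for key, value in training_example["inputs"].items()
)
print(condition_string)
# eventType=ShotMade | shotType=2 | homeTeamScore=2 | ... | period=1
```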










Tags: neural-networks, deep-learning, python, generative-model






asked 9 hours ago by Hdot; edited 9 hours ago by nbro
          1 Answer

Generally, text generators work by modeling the joint distribution of the text by its Bayesian forward decomposition:

$$
\begin{align*}
p(w_1, w_2, \dots, w_n) &= p(w_1) \cdot p(w_2 \mid w_1) \cdot p(w_3 \mid w_2, w_1) \cdots p(w_n \mid \{w_i\}_{i<n}) \\
&= \prod_{i=1}^{n} p(w_i \mid \{w_k\}_{k<i})
\end{align*}
$$



From a modeling perspective, this looks right up an RNN's alley, where you can keep a state holding information from $\{w_k\}_{k<i}$ to learn a representation of $w_i$.



Now, in your specific case, you're interested in a conditional text generator, so you are trying to model $p(w_1, w_2, \dots, w_n \mid \{v_j\}_j)$, but this same tactic works:



$$
\begin{align*}
p(w_1, w_2, \dots, w_n \mid \{v_j\}_j) &= p(w_1 \mid \{v_j\}_j) \cdot p(w_2 \mid w_1, \{v_j\}_j) \cdot p(w_3 \mid w_2, w_1, \{v_j\}_j) \cdots p(w_n \mid \{w_i\}_{i<n}, \{v_j\}_j) \\
&= \prod_{i=1}^{n} p(w_i \mid \{w_k\}_{k<i}, \{v_j\}_j)
\end{align*}
$$



So, in your RNN or forward-based model, you can use the exact same approach: just additionally embed the conditional inputs you have and infuse them into the model somehow (in practice, I have seen this done through attention, concatenation, or some other common approach).
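As a concrete illustration of the concatenation option, here is a minimal sketch of a conditional autoregressive generator. It assumes PyTorch and made-up layer sizes, neither of which is prescribed by the answer: the structured inputs are projected into a conditioning vector that is concatenated onto every word embedding before the recurrent step.

```python
import torch
import torch.nn as nn

class ConditionalTextGenerator(nn.Module):
    """Autoregressive word model p(w_i | w_{<i}, v), with the condition v
    concatenated onto every word embedding (one of the fusion options above)."""

    def __init__(self, vocab_size, cond_dim, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.word_embed = nn.Embedding(vocab_size, embed_dim)
        self.cond_proj = nn.Linear(cond_dim, embed_dim)     # embed structured inputs
        self.rnn = nn.GRU(embed_dim * 2, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)        # next-word logits

    def forward(self, tokens, condition):
        # tokens:    (batch, seq_len) integer word ids
        # condition: (batch, cond_dim) vector built from the structured fields
        w = self.word_embed(tokens)                         # (B, T, E)
        c = self.cond_proj(condition).unsqueeze(1)          # (B, 1, E)
        c = c.expand(-1, w.size(1), -1)                     # (B, T, E)
        h, _ = self.rnn(torch.cat([w, c], dim=-1))          # (B, T, H)
        return self.out(h)                                  # (B, T, V)

# Tiny smoke test with made-up sizes.
model = ConditionalTextGenerator(vocab_size=5000, cond_dim=7)
tokens = torch.randint(0, 5000, (2, 12))
condition = torch.randn(2, 7)        # e.g. encoded eventType, scores, period, ...
logits = model(tokens, condition)    # shape: (2, 12, 5000)
```

Training would then be ordinary next-word prediction, i.e. cross-entropy between `logits[:, :-1]` and `tokens[:, 1:]`, which is exactly the conditional factorization above.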



My recommendation (depending on the computational power you have) is to take advantage of the recent fad of pre-trained language models. Specifically, ones trained on next-word prediction will probably do the job best. A good example is GPT-2, and, if you check out its GitHub repository, the code is very readable and easy to adjust for adding conditional input in the ways I have described.
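The answer points at OpenAI's own GPT-2 repository; as a rough alternative sketch, the same idea can be tried with the Hugging Face `transformers` port (a library choice assumed here, not mentioned in the answer) by simply serializing the structured fields into the prompt rather than changing the architecture.

```python
# Sketch of the "serialize the condition into the prompt" shortcut, using the
# Hugging Face `transformers` port of GPT-2 (an assumed library choice).
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Flatten the structured event into text and let the language model continue it.
# Without fine-tuning on (condition, tweet) pairs the continuation will be
# generic; fine-tuning is what makes this approach produce play-by-play text.
prompt = (
    "eventType=ShotMade | shotType=2 | homeTeamScore=2 | awayTeamScore=8 | "
    "player=JR Smith | assist=George Hill | period=1\nCommentary:"
)
input_ids = tokenizer.encode(prompt, return_tensors="pt")
output_ids = model.generate(
    input_ids,
    max_length=input_ids.shape[1] + 40,
    do_sample=True,
    top_k=50,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```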






answered 8 hours ago by mshlis; edited 7 hours ago by nbro



















